Mar 11 09:13:51 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 11 09:13:51 crc restorecon[4684]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 11 09:13:51 crc restorecon[4684]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc 
restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc 
restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 
09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 
09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc 
restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc 
restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 11 09:13:51 crc restorecon[4684]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 
crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc 
restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:51 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:51 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 11 09:13:52 crc restorecon[4684]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc 
restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 11 09:13:52 crc restorecon[4684]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 11 09:13:52 crc kubenswrapper[4830]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 09:13:52 crc kubenswrapper[4830]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 11 09:13:52 crc kubenswrapper[4830]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 09:13:52 crc kubenswrapper[4830]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 11 09:13:52 crc kubenswrapper[4830]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 11 09:13:52 crc kubenswrapper[4830]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.661661 4830 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670825 4830 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670928 4830 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670938 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670948 4830 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670957 4830 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670966 4830 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670975 4830 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670984 4830 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.670995 4830 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671034 4830 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671043 4830 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671052 4830 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671060 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671069 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671078 4830 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671086 4830 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671094 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671102 4830 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671110 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671118 4830 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671127 4830 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671135 4830 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671146 4830 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671161 4830 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671172 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671181 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671190 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671198 4830 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671206 4830 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671214 4830 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671222 4830 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671230 4830 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671253 4830 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671261 4830 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671269 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671277 4830 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671285 4830 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671293 4830 feature_gate.go:330] unrecognized feature gate: Example
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671300 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671309 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671316 4830 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671323 4830 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671331 4830 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671338 4830 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671346 4830 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671355 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671362 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671370 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671380 4830 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671389 4830 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671399 4830 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671407 4830 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671415 4830 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671424 4830 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671432 4830 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671441 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671449 4830 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671457 4830 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671464 4830 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671472 4830 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671480 4830 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671487 4830 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671494 4830 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671502 4830 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671510 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671518 4830 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671526 4830 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671533 4830 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671541 4830 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671549 4830 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.671557 4830 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671724 4830 flags.go:64] FLAG: --address="0.0.0.0"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671743 4830 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671769 4830 flags.go:64] FLAG: --anonymous-auth="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671855 4830 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671868 4830 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671878 4830 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671890 4830 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671902 4830 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671911 4830 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671920 4830 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671930 4830 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671941 4830 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671951 4830 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671960 4830 flags.go:64] FLAG: --cgroup-root=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671969 4830 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671978 4830 flags.go:64] FLAG: --client-ca-file=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671989 4830 flags.go:64] FLAG: --cloud-config=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.671997 4830 flags.go:64] FLAG: --cloud-provider=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672006 4830 flags.go:64] FLAG: --cluster-dns="[]"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672041 4830 flags.go:64] FLAG: --cluster-domain=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672050 4830 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672061 4830 flags.go:64] FLAG: --config-dir=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672070 4830 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672080 4830 flags.go:64] FLAG: --container-log-max-files="5"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672091 4830 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672100 4830 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672109 4830 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672119 4830 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672128 4830 flags.go:64] FLAG: --contention-profiling="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672137 4830 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672146 4830 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672156 4830 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672165 4830 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672176 4830 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672185 4830 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672194 4830 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672203 4830 flags.go:64] FLAG: --enable-load-reader="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672212 4830 flags.go:64] FLAG: --enable-server="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672221 4830 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672233 4830 flags.go:64] FLAG: --event-burst="100"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672243 4830 flags.go:64] FLAG: --event-qps="50"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672252 4830 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672261 4830 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672270 4830 flags.go:64] FLAG: --eviction-hard=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672281 4830 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672290 4830 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672299 4830 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672311 4830 flags.go:64] FLAG: --eviction-soft=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672320 4830 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672329 4830 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672338 4830 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672347 4830 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672356 4830 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672366 4830 flags.go:64] FLAG: --fail-swap-on="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672375 4830 flags.go:64] FLAG: --feature-gates=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672387 4830 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672397 4830 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672406 4830 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672415 4830 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672425 4830 flags.go:64] FLAG: --healthz-port="10248"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672434 4830 flags.go:64] FLAG: --help="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672443 4830 flags.go:64] FLAG: --hostname-override=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672452 4830 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672461 4830 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672470 4830 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672479 4830 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672488 4830 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672498 4830 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672507 4830 flags.go:64] FLAG: --image-service-endpoint=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672516 4830 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672525 4830 flags.go:64] FLAG: --kube-api-burst="100"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672534 4830 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672544 4830 flags.go:64] FLAG: --kube-api-qps="50"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672554 4830 flags.go:64] FLAG: --kube-reserved=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672563 4830 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672573 4830 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672583 4830 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672591 4830 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672601 4830 flags.go:64] FLAG: --lock-file=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672610 4830 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672619 4830 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672629 4830 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672652 4830 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672665 4830 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672675 4830 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672684 4830 flags.go:64] FLAG: --logging-format="text"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672693 4830 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672703 4830 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672712 4830 flags.go:64] FLAG: --manifest-url=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672721 4830 flags.go:64] FLAG: --manifest-url-header=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672733 4830 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672742 4830 flags.go:64] FLAG: --max-open-files="1000000"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672752 4830 flags.go:64] FLAG: --max-pods="110"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672761 4830 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672771 4830 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672781 4830 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672789 4830 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672798 4830 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672808 4830 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672818 4830 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672840 4830 flags.go:64] FLAG: --node-status-max-images="50"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672850 4830 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672860 4830 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672869 4830 flags.go:64] FLAG: --pod-cidr=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672879 4830 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672893 4830 flags.go:64] FLAG: --pod-manifest-path=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672902 4830 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672911 4830 flags.go:64] FLAG: --pods-per-core="0"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672920 4830 flags.go:64] FLAG: --port="10250"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672930 4830 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672939 4830 flags.go:64] FLAG: --provider-id=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672948 4830 flags.go:64] FLAG: --qos-reserved=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672957 4830 flags.go:64] FLAG: --read-only-port="10255"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672966 4830 flags.go:64] FLAG: --register-node="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672976 4830 flags.go:64] FLAG: --register-schedulable="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.672985 4830 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673000 4830 flags.go:64] FLAG: --registry-burst="10"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673009 4830 flags.go:64] FLAG: --registry-qps="5"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673043 4830 flags.go:64] FLAG: --reserved-cpus=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673054 4830 flags.go:64] FLAG: --reserved-memory=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673065 4830 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673074 4830 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673083 4830 flags.go:64] FLAG: --rotate-certificates="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673093 4830 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673102 4830 flags.go:64] FLAG: --runonce="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673111 4830 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673120 4830 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673130 4830 flags.go:64] FLAG: --seccomp-default="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673140 4830 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673151 4830 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673161 4830 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.673170 4830 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674121 4830 flags.go:64] FLAG: --storage-driver-password="root"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674138 4830 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674148 4830 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674158 4830 flags.go:64] FLAG: --storage-driver-user="root"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674168 4830 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674178 4830 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674188 4830 flags.go:64] FLAG: --system-cgroups=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674197 4830 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674215 4830 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674224 4830 flags.go:64] FLAG: --tls-cert-file=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674232 4830 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674244 4830 flags.go:64] FLAG: --tls-min-version=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674255 4830 flags.go:64] FLAG: --tls-private-key-file=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674264 4830 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674273 4830 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674282 4830 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674292 4830 flags.go:64] FLAG: --v="2"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674304 4830 flags.go:64] FLAG: --version="false"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674315 4830 flags.go:64] FLAG: --vmodule=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674327 4830 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.674336 4830 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674578 4830 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674589 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674600 4830 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674611 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674619 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674628 4830 feature_gate.go:330] unrecognized feature gate: Example
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674637 4830 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674647 4830 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674655 4830 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674664 4830 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674672 4830 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674680 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674689 4830 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674697 4830 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674706 4830 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674714 4830 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674722 4830 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674730 4830 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674739 4830 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674750 4830 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674759 4830 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674768 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674777 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674785 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674794 4830 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674802 4830 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674844 4830 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674854 4830 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674863 4830 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674872 4830 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674881 4830 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674890 4830 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674898 4830 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674906 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674914 4830 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674922 4830 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674930 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674940 4830 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674951 4830 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674960 4830 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674969 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674979 4830 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674987 4830 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.674995 4830 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675030 4830 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675039 4830 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675049 4830 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675057 4830 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675068 4830 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675078 4830 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675087 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675096 4830 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675104 4830 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675112 4830 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675119 4830 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675127 4830 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675135 4830 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675143 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675151 4830 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675159 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675167 4830 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675175 4830 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675183 4830 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675190 4830 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675198 4830 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675206 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675213 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675221 4830 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675229 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675236 4830 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.675244 4830 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.675258 4830 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.690099 4830 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.690181 4830 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690360 4830 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690390 4830 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690409 4830 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690424 4830 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690436 4830 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690446 4830 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690456 4830 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690466 4830 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690476 4830 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690485 4830 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690495 4830 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690507 4830 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690517 4830 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690527 4830 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690536 4830 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690546 4830 feature_gate.go:330] unrecognized
feature gate: CSIDriverSharedResource Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690556 4830 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690565 4830 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690575 4830 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690586 4830 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690596 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690625 4830 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690635 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690645 4830 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690655 4830 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690665 4830 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690675 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690684 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690694 4830 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690704 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 11 
09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690714 4830 feature_gate.go:330] unrecognized feature gate: Example Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690723 4830 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690733 4830 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690742 4830 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690757 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690768 4830 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690777 4830 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690787 4830 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690797 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690806 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690816 4830 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690826 4830 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690835 4830 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690845 4830 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690855 4830 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690865 4830 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690875 4830 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690885 4830 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690895 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690908 4830 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690922 4830 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690934 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690948 4830 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690960 4830 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690971 4830 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690982 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.690992 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691002 4830 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691043 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691055 4830 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691064 4830 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691074 4830 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691084 4830 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691096 4830 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691109 4830 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691121 4830 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691131 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691142 4830 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691152 4830 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691163 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691175 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.691192 4830 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691516 4830 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691535 4830 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691547 4830 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691559 4830 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691570 4830 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691584 4830 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691598 4830 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691608 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691619 4830 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691631 4830 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691641 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691652 4830 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691662 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691671 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691681 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691691 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691700 4830 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691711 4830 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691719 4830 feature_gate.go:330] unrecognized feature gate: Example
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691729 4830 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691738 4830 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691749 4830 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691758 4830 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691767 4830 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691780 4830 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691792 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691803 4830 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691814 4830 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691825 4830 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691835 4830 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691845 4830 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691855 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691864 4830 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691876 4830 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691888 4830 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691898 4830 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691909 4830 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691918 4830 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691927 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691936 4830 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691946 4830 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691956 4830 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691965 4830 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691974 4830 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691987 4830 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.691997 4830 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692007 4830 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692076 4830 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692089 4830 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692099 4830 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692108 4830 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692120 4830 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692133 4830 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692143 4830 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692152 4830 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692164 4830 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692175 4830 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692185 4830 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692195 4830 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692204 4830 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692214 4830 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692224 4830 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692234 4830 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692244 4830 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692254 4830 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692263 4830 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692273 4830 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692283 4830 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692292 4830 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692302 4830 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.692314 4830 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.692328 4830 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.692717 4830 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.698233 4830 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.704082 4830 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.704274 4830 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.706338 4830 server.go:997] "Starting client certificate rotation"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.706389 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.706692 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.735997 4830 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.737947 4830 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.740457 4830 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.761957 4830 log.go:25] "Validated CRI v1 runtime API"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.804068 4830 log.go:25] "Validated CRI v1 image API"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.806692 4830 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.814205 4830 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-11-09-09-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.814252 4830 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.842550 4830 manager.go:217] Machine: {Timestamp:2026-03-11 09:13:52.838757421 +0000 UTC m=+0.619908180 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2fdf51c2-3f48-4868-b069-1f3b038d7ba4 BootID:fae51179-5520-451c-a453-83531ae25f54 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:59:8a:e4 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:59:8a:e4 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:66:7b:92 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5d:a3:79 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2b:21:0d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:4e:e7:70 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:de:78:08:20:91:17 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1a:3d:c1:08:ad:67 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.842961 4830 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.843170 4830 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.846221 4830 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.846583 4830 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.846650 4830 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.847127 4830 topology_manager.go:138] "Creating topology manager with none policy"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.847147 4830 container_manager_linux.go:303] "Creating device plugin manager"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.847824 4830 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.847883 4830 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.848161 4830 state_mem.go:36] "Initialized new in-memory state store"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.848746 4830 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.853169 4830 kubelet.go:418] "Attempting to sync node with API server"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.853215 4830 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.853258 4830 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.853279 4830 kubelet.go:324] "Adding apiserver pod source"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.853299 4830 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.858148 4830 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.859512 4830 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.860382 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused
Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.860539 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError"
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.860404 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused
Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.860986 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.862415 4830 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.863994 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864078 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864093 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864107 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864130 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864143 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864161 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864182 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864198 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864211 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864250 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.864263 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.865529 4830 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.866249 4830 server.go:1280] "Started kubelet"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.868393 4830 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 11 09:13:52 crc systemd[1]: Started Kubernetes Kubelet.
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.868639 4830 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.873808 4830 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.876729 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.877078 4830 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.878396 4830 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.878424 4830 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.878618 4830 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.878624 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.879292 4830 server.go:460] "Adding debug handlers to kubelet server"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.879716 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused
Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.879720 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="200ms"
Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.880373 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused
Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.880438 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError"
Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.881303 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189bbe96b9744e7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,LastTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.883869 4830 factory.go:153] Registering CRI-O factory
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.883905 4830 factory.go:221] Registration of the crio container factory successfully
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.883993 4830 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.884008 4830 factory.go:55] Registering systemd factory
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.884037 4830 factory.go:221] Registration of the systemd container factory successfully
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.884065 4830 factory.go:103] Registering Raw factory
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.884083 4830 manager.go:1196] Started watching for new ooms in manager
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.886894 4830 manager.go:319] Starting recovery of all containers
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892523 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892592 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892616 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892636 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892656 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892675 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892692 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892712 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892735 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892753 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892771 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892789 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892806 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892828 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892848 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892866 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892887 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892905 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.892959 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893004 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893048 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893067 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893085 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893103 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893121 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893142 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893164 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893183 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893201 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893247 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893265 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893284 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893308 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893327 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893346 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893363 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893382 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893400 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893418 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893439 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893483 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893510 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893528 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893547 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893569 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893587 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893605 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893624 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893644 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893662 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893680 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893700 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893727 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893749 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893770 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893793 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893812 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893830 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893848 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893867 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893885 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893904 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893923 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893942 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893962 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.893980 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894007 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894049 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894067 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894089 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894106 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894124 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894142 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894160 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894179 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894198 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894219 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894238 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894285 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894305 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894325 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894346 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894366 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894386 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894404 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894423 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894441 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894460 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894477 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894496 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894514 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894532 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894552 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894571 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894592 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894610 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894631 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894650 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894668 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894687 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894704 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894725 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894745 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894763 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894795 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894816 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894835 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894854 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894877 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894897 4830 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894917 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894937 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894959 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894977 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.894998 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895041 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895059 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895077 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895097 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895118 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895139 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895159 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895177 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895198 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895218 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895238 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895256 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895276 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895295 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895315 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895333 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895360 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895378 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895396 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895416 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895435 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895453 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895473 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895492 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895511 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895532 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895552 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895571 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895591 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895611 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895629 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" 
seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895647 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895665 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895683 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895702 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895719 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895738 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895756 4830 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895774 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895792 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895811 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895829 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895848 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895868 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895888 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895907 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895925 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895946 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.895965 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896001 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896056 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896076 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896096 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896115 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896135 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896185 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896203 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896224 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.896242 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897496 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897533 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897558 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897593 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897614 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897636 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897660 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897683 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897703 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897748 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897769 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897791 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897814 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897835 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897856 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897878 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897899 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897921 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897942 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897962 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.897984 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.898005 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.898058 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.898080 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.898102 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.900786 4830 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.900855 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.900888 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.900919 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.900941 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.900962 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.900982 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.901004 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.901063 4830 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.901088 4830 reconstruct.go:97] "Volume reconstruction finished" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.901102 4830 reconciler.go:26] "Reconciler: start to sync state" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.911827 4830 manager.go:324] Recovery completed Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.927777 4830 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.930736 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.931125 4830 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.931164 4830 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.931195 4830 kubelet.go:2335] "Starting kubelet main sync loop" Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.931255 4830 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 11 09:13:52 crc kubenswrapper[4830]: W0311 09:13:52.931951 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.932055 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.932879 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.932912 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.932921 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.935630 4830 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.935661 4830 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" 
Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.935685 4830 state_mem.go:36] "Initialized new in-memory state store" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.957221 4830 policy_none.go:49] "None policy: Start" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.959666 4830 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 11 09:13:52 crc kubenswrapper[4830]: I0311 09:13:52.959704 4830 state_mem.go:35] "Initializing new in-memory state store" Mar 11 09:13:52 crc kubenswrapper[4830]: E0311 09:13:52.979181 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.008362 4830 manager.go:334] "Starting Device Plugin manager" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.008424 4830 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.008440 4830 server.go:79] "Starting device plugin registration server" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.008982 4830 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.009004 4830 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.009742 4830 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.009989 4830 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.009997 4830 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 11 09:13:53 crc kubenswrapper[4830]: E0311 09:13:53.018597 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get 
node info: node \"crc\" not found" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.031764 4830 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.031936 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.033346 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.033382 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.033394 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.033552 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.033977 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.034094 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.034744 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.034782 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.034795 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.034965 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.035120 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.035150 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.035718 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.035789 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.035809 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036090 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036248 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036288 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036505 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036530 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036540 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036552 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036565 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.036576 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.037195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.037227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.037237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.038574 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc 
kubenswrapper[4830]: I0311 09:13:53.038605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.038616 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.038758 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.038944 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.039006 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.039554 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.039588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.039600 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.039794 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.039837 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.040518 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.040550 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.040561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.040561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.040580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.040588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: E0311 09:13:53.080355 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="400ms" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104096 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104142 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104165 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104183 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104205 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104229 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 
crc kubenswrapper[4830]: I0311 09:13:53.104364 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104446 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104484 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104518 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104567 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104606 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104632 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104652 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.104669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.110602 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.112678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.112730 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: 
I0311 09:13:53.112743 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.112780 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:13:53 crc kubenswrapper[4830]: E0311 09:13:53.113650 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205732 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205785 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205808 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205836 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205863 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205886 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205908 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205931 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.205957 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206054 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206082 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206091 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206116 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206134 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206150 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206168 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206207 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206217 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206215 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206134 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206310 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206166 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206428 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206441 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206444 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206413 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206467 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206479 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.206505 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.314099 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.315756 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.315799 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.315817 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.315851 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:13:53 crc kubenswrapper[4830]: E0311 09:13:53.316380 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.360117 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.380272 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.386795 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: W0311 09:13:53.404601 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-36303b84300938df9c26069aad6444097d7ab100bf318e6d8c0b78c1110b58a8 WatchSource:0}: Error finding container 36303b84300938df9c26069aad6444097d7ab100bf318e6d8c0b78c1110b58a8: Status 404 returned error can't find the container with id 36303b84300938df9c26069aad6444097d7ab100bf318e6d8c0b78c1110b58a8 Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.408796 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.417925 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:53 crc kubenswrapper[4830]: W0311 09:13:53.421083 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9669fabc66a612450b7db5fb740200012ea3893a710be1713e895c42208fcc00 WatchSource:0}: Error finding container 9669fabc66a612450b7db5fb740200012ea3893a710be1713e895c42208fcc00: Status 404 returned error can't find the container with id 9669fabc66a612450b7db5fb740200012ea3893a710be1713e895c42208fcc00 Mar 11 09:13:53 crc kubenswrapper[4830]: W0311 09:13:53.431101 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-17d41d46ab7a50fc7677b5f9dac6163f8e6e0bb6174cc6b94cb6ac7a1b59a8ab WatchSource:0}: Error finding container 17d41d46ab7a50fc7677b5f9dac6163f8e6e0bb6174cc6b94cb6ac7a1b59a8ab: Status 404 returned error can't find the container with id 17d41d46ab7a50fc7677b5f9dac6163f8e6e0bb6174cc6b94cb6ac7a1b59a8ab Mar 11 09:13:53 crc kubenswrapper[4830]: W0311 09:13:53.441054 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7b296ac4586f8274a11cbac8f9a6f12f9cb9573b738846fb4fafe39e3fd343cd WatchSource:0}: Error finding container 7b296ac4586f8274a11cbac8f9a6f12f9cb9573b738846fb4fafe39e3fd343cd: Status 404 returned error can't find the container with id 7b296ac4586f8274a11cbac8f9a6f12f9cb9573b738846fb4fafe39e3fd343cd Mar 11 09:13:53 crc kubenswrapper[4830]: W0311 09:13:53.443790 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-41b25a3579d69317659534e914c0ec612fb89a157bc52c212b83abbe2b95b436 WatchSource:0}: Error finding container 41b25a3579d69317659534e914c0ec612fb89a157bc52c212b83abbe2b95b436: Status 404 returned error can't find the container with id 41b25a3579d69317659534e914c0ec612fb89a157bc52c212b83abbe2b95b436 Mar 11 09:13:53 crc kubenswrapper[4830]: E0311 09:13:53.481319 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="800ms" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.716991 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.718456 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.718508 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.718519 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.718549 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:13:53 crc kubenswrapper[4830]: E0311 09:13:53.719092 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.881062 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.942005 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7b296ac4586f8274a11cbac8f9a6f12f9cb9573b738846fb4fafe39e3fd343cd"} Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.943471 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17d41d46ab7a50fc7677b5f9dac6163f8e6e0bb6174cc6b94cb6ac7a1b59a8ab"} Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.944710 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9669fabc66a612450b7db5fb740200012ea3893a710be1713e895c42208fcc00"} Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.945790 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"36303b84300938df9c26069aad6444097d7ab100bf318e6d8c0b78c1110b58a8"} Mar 11 09:13:53 crc kubenswrapper[4830]: I0311 09:13:53.946793 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"41b25a3579d69317659534e914c0ec612fb89a157bc52c212b83abbe2b95b436"} Mar 11 09:13:53 crc kubenswrapper[4830]: W0311 09:13:53.983979 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:53 crc kubenswrapper[4830]: E0311 09:13:53.984579 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Mar 11 09:13:53 crc kubenswrapper[4830]: W0311 09:13:53.997645 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:53 crc kubenswrapper[4830]: E0311 09:13:53.997733 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Mar 11 09:13:54 crc kubenswrapper[4830]: W0311 09:13:54.169415 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:54 crc kubenswrapper[4830]: E0311 09:13:54.169554 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" 
logger="UnhandledError" Mar 11 09:13:54 crc kubenswrapper[4830]: W0311 09:13:54.212244 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:54 crc kubenswrapper[4830]: E0311 09:13:54.212339 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Mar 11 09:13:54 crc kubenswrapper[4830]: E0311 09:13:54.282581 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="1.6s" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.519464 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.521338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.521392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.521406 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.521442 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:13:54 crc kubenswrapper[4830]: E0311 09:13:54.521993 4830 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.784310 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 09:13:54 crc kubenswrapper[4830]: E0311 09:13:54.785598 4830 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.880688 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.956758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1"} Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.956813 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.956850 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2"} Mar 11 09:13:54 crc kubenswrapper[4830]: 
I0311 09:13:54.956873 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"295bff38a4f50816d361fe6f33a66ff7cf2a597ba595601abea24c887dce296e"} Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.956893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac"} Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.958338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.958391 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.958412 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.960492 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f" exitCode=0 Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.960580 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f"} Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.960710 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.962671 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.962758 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.962778 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.963148 4830 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3" exitCode=0 Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.963261 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.963207 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3"} Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.964422 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.964453 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.964464 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.965805 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.966957 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 
09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.967048 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.967077 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.973455 4830 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8" exitCode=0 Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.973736 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.973786 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8"} Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.976208 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.976257 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.976272 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.981468 4830 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf" exitCode=0 Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.981566 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf"} Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.981607 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.983069 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.983113 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:54 crc kubenswrapper[4830]: I0311 09:13:54.983131 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:55 crc kubenswrapper[4830]: W0311 09:13:55.759373 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:55 crc kubenswrapper[4830]: E0311 09:13:55.759464 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.881717 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:55 crc kubenswrapper[4830]: E0311 09:13:55.883729 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="3.2s" Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.986477 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f"} Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.986644 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135"} Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.986660 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e"} Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.988784 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d"} Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.988870 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.989956 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.989998 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:55 crc 
kubenswrapper[4830]: I0311 09:13:55.990010 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.997162 4830 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb" exitCode=0 Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.997246 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb"} Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.997438 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.999090 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.999133 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:55 crc kubenswrapper[4830]: I0311 09:13:55.999154 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.001563 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.001566 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6"} Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.001605 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88"} Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.001619 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e"} Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.001694 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.004085 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.004116 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.004129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.004430 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.004508 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.004521 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.122356 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.123411 4830 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.123533 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.123560 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.123582 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:13:56 crc kubenswrapper[4830]: E0311 09:13:56.123963 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Mar 11 09:13:56 crc kubenswrapper[4830]: W0311 09:13:56.202594 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:56 crc kubenswrapper[4830]: E0311 09:13:56.202692 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Mar 11 09:13:56 crc kubenswrapper[4830]: I0311 09:13:56.460042 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:13:56 crc kubenswrapper[4830]: W0311 09:13:56.756861 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": 
dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:13:56 crc kubenswrapper[4830]: E0311 09:13:56.756988 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.006621 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bcd88a3b40dd7e68b135a62905a026143e3c999ecaf8c4adf192c68e4ea725f6"} Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.006682 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57"} Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.006717 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.007737 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.008117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.008129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.009504 4830 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594" exitCode=0 Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.009587 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.009613 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594"} Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.009647 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.009710 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010263 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010289 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010528 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010556 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010589 4830 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010619 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:57 crc kubenswrapper[4830]: I0311 09:13:57.010629 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.014985 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f"} Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.015042 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1"} Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.015055 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794"} Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.015063 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618"} Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.015072 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7"} Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.015093 4830 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.015139 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.015183 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.015194 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.016530 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.016580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.016599 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.016746 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.016782 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.016783 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.016807 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.016837 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 
09:13:58.016795 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.513153 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.823970 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.966066 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.966323 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.968280 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.968343 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.968360 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:58 crc kubenswrapper[4830]: I0311 09:13:58.975712 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.018297 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.018349 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.018409 4830 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.020866 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.020923 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.020945 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.023171 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.023224 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.023239 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.024292 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.024316 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.024326 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.080082 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.300775 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" 
Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.324367 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.325932 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.325970 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.325983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:13:59 crc kubenswrapper[4830]: I0311 09:13:59.326008 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.021222 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.022388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.022471 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.022506 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.740090 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.740272 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.740339 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.741925 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.741991 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:00 crc kubenswrapper[4830]: I0311 09:14:00.742010 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.023367 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.024545 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.024582 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.024592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.877141 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.877385 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.878841 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.878916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:01 crc kubenswrapper[4830]: I0311 09:14:01.878941 4830 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.099362 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.099560 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.100767 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.100798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.100808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.339047 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.339334 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.340776 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.340819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:02 crc kubenswrapper[4830]: I0311 09:14:02.340837 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:03 crc kubenswrapper[4830]: E0311 09:14:03.018727 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 09:14:03 
crc kubenswrapper[4830]: I0311 09:14:03.740631 4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:14:03 crc kubenswrapper[4830]: I0311 09:14:03.740744 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:14:04 crc kubenswrapper[4830]: I0311 09:14:04.578480 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:14:04 crc kubenswrapper[4830]: I0311 09:14:04.578714 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:04 crc kubenswrapper[4830]: I0311 09:14:04.580406 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:04 crc kubenswrapper[4830]: I0311 09:14:04.580469 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:04 crc kubenswrapper[4830]: I0311 09:14:04.580485 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:04 crc kubenswrapper[4830]: I0311 09:14:04.585220 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:14:05 crc kubenswrapper[4830]: I0311 09:14:05.031178 4830 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Mar 11 09:14:05 crc kubenswrapper[4830]: I0311 09:14:05.032100 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:05 crc kubenswrapper[4830]: I0311 09:14:05.032172 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:05 crc kubenswrapper[4830]: I0311 09:14:05.032201 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:06 crc kubenswrapper[4830]: I0311 09:14:06.881649 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 11 09:14:07 crc kubenswrapper[4830]: W0311 09:14:07.041125 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 11 09:14:07 crc kubenswrapper[4830]: I0311 09:14:07.041415 4830 trace.go:236] Trace[167324703]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (11-Mar-2026 09:13:57.039) (total time: 10001ms): Mar 11 09:14:07 crc kubenswrapper[4830]: Trace[167324703]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:14:07.041) Mar 11 09:14:07 crc kubenswrapper[4830]: Trace[167324703]: [10.001612645s] [10.001612645s] END Mar 11 09:14:07 crc kubenswrapper[4830]: E0311 09:14:07.041545 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 11 09:14:07 crc kubenswrapper[4830]: I0311 09:14:07.222227 4830 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46788->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 11 09:14:07 crc kubenswrapper[4830]: I0311 09:14:07.222532 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46788->192.168.126.11:17697: read: connection reset by peer" Mar 11 09:14:07 crc kubenswrapper[4830]: E0311 09:14:07.697927 4830 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:07 crc kubenswrapper[4830]: W0311 09:14:07.699741 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z Mar 11 09:14:07 crc kubenswrapper[4830]: E0311 
09:14:07.699807 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:07 crc kubenswrapper[4830]: W0311 09:14:07.704761 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z Mar 11 09:14:07 crc kubenswrapper[4830]: E0311 09:14:07.704894 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:07 crc kubenswrapper[4830]: E0311 09:14:07.706748 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bbe96b9744e7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,LastTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:07 crc kubenswrapper[4830]: E0311 09:14:07.710365 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 09:14:07 crc kubenswrapper[4830]: W0311 09:14:07.711493 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z Mar 11 09:14:07 crc kubenswrapper[4830]: E0311 09:14:07.711624 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:07 crc kubenswrapper[4830]: E0311 09:14:07.711862 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 
2026-02-23T05:33:13Z" interval="6.4s" Mar 11 09:14:07 crc kubenswrapper[4830]: I0311 09:14:07.715824 4830 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 09:14:07 crc kubenswrapper[4830]: I0311 09:14:07.715894 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 11 09:14:07 crc kubenswrapper[4830]: I0311 09:14:07.724695 4830 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]log ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]etcd ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/generic-apiserver-start-informers ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 11 
09:14:07 crc kubenswrapper[4830]: [+]poststarthook/priority-and-fairness-filter ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/start-apiextensions-informers ok Mar 11 09:14:07 crc kubenswrapper[4830]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/start-system-namespaces-controller ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 11 09:14:07 crc kubenswrapper[4830]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/start-kube-aggregator-informers ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 11 09:14:07 
crc kubenswrapper[4830]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 11 09:14:07 crc kubenswrapper[4830]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 11 09:14:07 crc kubenswrapper[4830]: [-]autoregister-completion failed: reason withheld Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/apiservice-openapi-controller ok Mar 11 09:14:07 crc kubenswrapper[4830]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 11 09:14:07 crc kubenswrapper[4830]: livez check failed Mar 11 09:14:07 crc kubenswrapper[4830]: I0311 09:14:07.724784 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:14:07 crc kubenswrapper[4830]: I0311 09:14:07.883985 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:07Z is after 2026-02-23T05:33:13Z Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 09:14:08.041438 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 09:14:08.044044 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bcd88a3b40dd7e68b135a62905a026143e3c999ecaf8c4adf192c68e4ea725f6" exitCode=255 Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 
09:14:08.044087 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bcd88a3b40dd7e68b135a62905a026143e3c999ecaf8c4adf192c68e4ea725f6"} Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 09:14:08.044356 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 09:14:08.045590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 09:14:08.045644 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 09:14:08.045656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 09:14:08.046214 4830 scope.go:117] "RemoveContainer" containerID="bcd88a3b40dd7e68b135a62905a026143e3c999ecaf8c4adf192c68e4ea725f6" Mar 11 09:14:08 crc kubenswrapper[4830]: I0311 09:14:08.882798 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:08Z is after 2026-02-23T05:33:13Z Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.048722 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.049483 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.051383 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2" exitCode=255 Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.051428 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2"} Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.051482 4830 scope.go:117] "RemoveContainer" containerID="bcd88a3b40dd7e68b135a62905a026143e3c999ecaf8c4adf192c68e4ea725f6" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.051643 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.052418 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.052444 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.052456 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.052998 4830 scope.go:117] "RemoveContainer" containerID="70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2" Mar 11 09:14:09 crc kubenswrapper[4830]: E0311 09:14:09.053235 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.304726 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:14:09 crc kubenswrapper[4830]: I0311 09:14:09.885780 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:09Z is after 2026-02-23T05:33:13Z Mar 11 09:14:10 crc kubenswrapper[4830]: I0311 09:14:10.055291 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 09:14:10 crc kubenswrapper[4830]: I0311 09:14:10.057515 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:10 crc kubenswrapper[4830]: I0311 09:14:10.058830 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:10 crc kubenswrapper[4830]: I0311 09:14:10.058877 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:10 crc kubenswrapper[4830]: I0311 09:14:10.058887 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:10 crc kubenswrapper[4830]: I0311 09:14:10.059439 4830 scope.go:117] "RemoveContainer" containerID="70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2" Mar 11 09:14:10 crc kubenswrapper[4830]: E0311 09:14:10.059609 4830 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:10 crc kubenswrapper[4830]: I0311 09:14:10.064265 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:14:10 crc kubenswrapper[4830]: I0311 09:14:10.886093 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:10Z is after 2026-02-23T05:33:13Z Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.060424 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.061679 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.061742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.061763 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.062678 4830 scope.go:117] "RemoveContainer" containerID="70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2" Mar 11 09:14:11 crc kubenswrapper[4830]: E0311 09:14:11.062968 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.884217 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:11Z is after 2026-02-23T05:33:13Z Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.908535 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.908831 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.910166 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.910223 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.910236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:11 crc kubenswrapper[4830]: I0311 09:14:11.923209 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 11 09:14:12 crc kubenswrapper[4830]: I0311 09:14:12.062927 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:12 crc kubenswrapper[4830]: I0311 09:14:12.064395 4830 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:12 crc kubenswrapper[4830]: I0311 09:14:12.064445 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:12 crc kubenswrapper[4830]: I0311 09:14:12.064463 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:12 crc kubenswrapper[4830]: I0311 09:14:12.885711 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:12Z is after 2026-02-23T05:33:13Z Mar 11 09:14:13 crc kubenswrapper[4830]: E0311 09:14:13.019330 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 09:14:13 crc kubenswrapper[4830]: W0311 09:14:13.229213 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:13Z is after 2026-02-23T05:33:13Z Mar 11 09:14:13 crc kubenswrapper[4830]: E0311 09:14:13.229325 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:13Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:13 crc kubenswrapper[4830]: I0311 09:14:13.741200 
4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:14:13 crc kubenswrapper[4830]: I0311 09:14:13.741313 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:14:13 crc kubenswrapper[4830]: I0311 09:14:13.883744 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:13Z is after 2026-02-23T05:33:13Z Mar 11 09:14:14 crc kubenswrapper[4830]: I0311 09:14:14.111148 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:14 crc kubenswrapper[4830]: I0311 09:14:14.113425 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:14 crc kubenswrapper[4830]: I0311 09:14:14.113474 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:14 crc kubenswrapper[4830]: I0311 09:14:14.113487 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:14 crc kubenswrapper[4830]: I0311 09:14:14.113528 4830 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Mar 11 09:14:14 crc kubenswrapper[4830]: E0311 09:14:14.116535 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:14Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 09:14:14 crc kubenswrapper[4830]: E0311 09:14:14.118970 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:14Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 09:14:14 crc kubenswrapper[4830]: I0311 09:14:14.884240 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:14Z is after 2026-02-23T05:33:13Z Mar 11 09:14:15 crc kubenswrapper[4830]: I0311 09:14:15.883913 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:15Z is after 2026-02-23T05:33:13Z Mar 11 09:14:15 crc kubenswrapper[4830]: I0311 09:14:15.954595 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 09:14:15 crc kubenswrapper[4830]: E0311 09:14:15.960158 4830 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: 
cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:16 crc kubenswrapper[4830]: I0311 09:14:16.885247 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:16Z is after 2026-02-23T05:33:13Z Mar 11 09:14:16 crc kubenswrapper[4830]: I0311 09:14:16.936224 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:14:16 crc kubenswrapper[4830]: I0311 09:14:16.936605 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:16 crc kubenswrapper[4830]: I0311 09:14:16.938171 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:16 crc kubenswrapper[4830]: I0311 09:14:16.938224 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:16 crc kubenswrapper[4830]: I0311 09:14:16.938244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:16 crc kubenswrapper[4830]: I0311 09:14:16.939146 4830 scope.go:117] "RemoveContainer" containerID="70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2" Mar 11 09:14:16 crc kubenswrapper[4830]: E0311 09:14:16.939431 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:17 crc kubenswrapper[4830]: W0311 09:14:17.131080 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:17Z is after 2026-02-23T05:33:13Z Mar 11 09:14:17 crc kubenswrapper[4830]: E0311 09:14:17.131189 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:17 crc kubenswrapper[4830]: E0311 09:14:17.714964 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:17Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bbe96b9744e7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,LastTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC 
m=+0.647351930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:17 crc kubenswrapper[4830]: I0311 09:14:17.885784 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:17Z is after 2026-02-23T05:33:13Z Mar 11 09:14:18 crc kubenswrapper[4830]: W0311 09:14:18.165379 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:18Z is after 2026-02-23T05:33:13Z Mar 11 09:14:18 crc kubenswrapper[4830]: E0311 09:14:18.165474 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:18 crc kubenswrapper[4830]: I0311 09:14:18.513733 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:14:18 crc kubenswrapper[4830]: I0311 09:14:18.514124 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:18 crc kubenswrapper[4830]: I0311 09:14:18.516300 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:18 crc 
kubenswrapper[4830]: I0311 09:14:18.516375 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:18 crc kubenswrapper[4830]: I0311 09:14:18.516394 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:18 crc kubenswrapper[4830]: I0311 09:14:18.517230 4830 scope.go:117] "RemoveContainer" containerID="70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2" Mar 11 09:14:18 crc kubenswrapper[4830]: E0311 09:14:18.517524 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:18 crc kubenswrapper[4830]: I0311 09:14:18.885583 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:18Z is after 2026-02-23T05:33:13Z Mar 11 09:14:19 crc kubenswrapper[4830]: I0311 09:14:19.885920 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:19Z is after 2026-02-23T05:33:13Z Mar 11 09:14:20 crc kubenswrapper[4830]: W0311 09:14:20.076235 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:20Z is after 2026-02-23T05:33:13Z Mar 11 09:14:20 crc kubenswrapper[4830]: E0311 09:14:20.076519 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:20 crc kubenswrapper[4830]: I0311 09:14:20.885123 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:20Z is after 2026-02-23T05:33:13Z Mar 11 09:14:21 crc kubenswrapper[4830]: E0311 09:14:21.120324 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:21Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 09:14:21 crc kubenswrapper[4830]: I0311 09:14:21.120343 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:21 crc kubenswrapper[4830]: I0311 09:14:21.121714 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:21 crc kubenswrapper[4830]: I0311 09:14:21.121742 4830 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:21 crc kubenswrapper[4830]: I0311 09:14:21.121751 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:21 crc kubenswrapper[4830]: I0311 09:14:21.121774 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:14:21 crc kubenswrapper[4830]: E0311 09:14:21.124267 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:21Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 09:14:21 crc kubenswrapper[4830]: I0311 09:14:21.886441 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:21Z is after 2026-02-23T05:33:13Z Mar 11 09:14:22 crc kubenswrapper[4830]: I0311 09:14:22.884963 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:22Z is after 2026-02-23T05:33:13Z Mar 11 09:14:23 crc kubenswrapper[4830]: E0311 09:14:23.019458 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.741186 4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.741424 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.741510 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.741712 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.743347 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.743402 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.743420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.744562 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"295bff38a4f50816d361fe6f33a66ff7cf2a597ba595601abea24c887dce296e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 11 09:14:23 crc 
kubenswrapper[4830]: I0311 09:14:23.745081 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://295bff38a4f50816d361fe6f33a66ff7cf2a597ba595601abea24c887dce296e" gracePeriod=30 Mar 11 09:14:23 crc kubenswrapper[4830]: I0311 09:14:23.886060 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:23Z is after 2026-02-23T05:33:13Z Mar 11 09:14:23 crc kubenswrapper[4830]: W0311 09:14:23.913763 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:23Z is after 2026-02-23T05:33:13Z Mar 11 09:14:23 crc kubenswrapper[4830]: E0311 09:14:23.913854 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:24 crc kubenswrapper[4830]: I0311 09:14:24.885546 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-11T09:14:24Z is after 2026-02-23T05:33:13Z Mar 11 09:14:25 crc kubenswrapper[4830]: I0311 09:14:25.114220 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 09:14:25 crc kubenswrapper[4830]: I0311 09:14:25.114857 4830 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="295bff38a4f50816d361fe6f33a66ff7cf2a597ba595601abea24c887dce296e" exitCode=255 Mar 11 09:14:25 crc kubenswrapper[4830]: I0311 09:14:25.114923 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"295bff38a4f50816d361fe6f33a66ff7cf2a597ba595601abea24c887dce296e"} Mar 11 09:14:25 crc kubenswrapper[4830]: I0311 09:14:25.883268 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:25Z is after 2026-02-23T05:33:13Z Mar 11 09:14:26 crc kubenswrapper[4830]: I0311 09:14:26.123562 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 09:14:26 crc kubenswrapper[4830]: I0311 09:14:26.124447 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48"} Mar 11 09:14:26 crc kubenswrapper[4830]: I0311 09:14:26.124665 
4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:26 crc kubenswrapper[4830]: I0311 09:14:26.126311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:26 crc kubenswrapper[4830]: I0311 09:14:26.126399 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:26 crc kubenswrapper[4830]: I0311 09:14:26.126431 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:26 crc kubenswrapper[4830]: I0311 09:14:26.882715 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:26Z is after 2026-02-23T05:33:13Z Mar 11 09:14:27 crc kubenswrapper[4830]: I0311 09:14:27.126911 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:27 crc kubenswrapper[4830]: I0311 09:14:27.127916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:27 crc kubenswrapper[4830]: I0311 09:14:27.127978 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:27 crc kubenswrapper[4830]: I0311 09:14:27.127998 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:27 crc kubenswrapper[4830]: E0311 09:14:27.719882 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-11T09:14:27Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bbe96b9744e7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,LastTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:27 crc kubenswrapper[4830]: I0311 09:14:27.885823 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:27Z is after 2026-02-23T05:33:13Z Mar 11 09:14:28 crc kubenswrapper[4830]: I0311 09:14:28.124856 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:28 crc kubenswrapper[4830]: E0311 09:14:28.125829 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:28Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 09:14:28 crc kubenswrapper[4830]: I0311 09:14:28.126125 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:28 crc kubenswrapper[4830]: I0311 09:14:28.126167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 
09:14:28 crc kubenswrapper[4830]: I0311 09:14:28.126180 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:28 crc kubenswrapper[4830]: I0311 09:14:28.126209 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:14:28 crc kubenswrapper[4830]: E0311 09:14:28.131179 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:28Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 09:14:28 crc kubenswrapper[4830]: I0311 09:14:28.886391 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:28Z is after 2026-02-23T05:33:13Z Mar 11 09:14:29 crc kubenswrapper[4830]: I0311 09:14:29.885746 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:29Z is after 2026-02-23T05:33:13Z Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.740916 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.741238 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.743155 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.743218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.743236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.884918 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:30Z is after 2026-02-23T05:33:13Z Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.931739 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.933061 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.933108 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.933121 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:30 crc kubenswrapper[4830]: I0311 09:14:30.933720 4830 scope.go:117] "RemoveContainer" containerID="70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2" Mar 11 09:14:31 crc kubenswrapper[4830]: I0311 09:14:31.138857 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 09:14:31 crc kubenswrapper[4830]: I0311 09:14:31.883972 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:31Z is after 2026-02-23T05:33:13Z Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.146005 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.146551 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.148807 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b6cc880c44e8e93bfe9c47b4b29beb1ca8b896e5172bfac6186b6cdaf58a3f82" exitCode=255 Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.148862 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b6cc880c44e8e93bfe9c47b4b29beb1ca8b896e5172bfac6186b6cdaf58a3f82"} Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.148907 4830 scope.go:117] "RemoveContainer" containerID="70c1f2a1c58dca72e8a6873ac8c29314a8d31ee6f36ee98f9f06826b51e4c3f2" Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.149129 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.150389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.150429 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:32 crc 
kubenswrapper[4830]: I0311 09:14:32.150455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.151378 4830 scope.go:117] "RemoveContainer" containerID="b6cc880c44e8e93bfe9c47b4b29beb1ca8b896e5172bfac6186b6cdaf58a3f82" Mar 11 09:14:32 crc kubenswrapper[4830]: E0311 09:14:32.151673 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.209742 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 09:14:32 crc kubenswrapper[4830]: E0311 09:14:32.215351 4830 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:32 crc kubenswrapper[4830]: E0311 09:14:32.216576 4830 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 11 09:14:32 crc kubenswrapper[4830]: I0311 09:14:32.885491 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:32Z is after 2026-02-23T05:33:13Z Mar 11 09:14:33 crc kubenswrapper[4830]: E0311 09:14:33.020181 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 09:14:33 crc kubenswrapper[4830]: I0311 09:14:33.154083 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 09:14:33 crc kubenswrapper[4830]: I0311 09:14:33.741133 4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:14:33 crc kubenswrapper[4830]: I0311 09:14:33.741224 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:14:33 crc kubenswrapper[4830]: I0311 09:14:33.886000 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:33Z is after 2026-02-23T05:33:13Z Mar 11 09:14:34 crc kubenswrapper[4830]: I0311 
09:14:34.578917 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:14:34 crc kubenswrapper[4830]: I0311 09:14:34.579213 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:34 crc kubenswrapper[4830]: I0311 09:14:34.580843 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:34 crc kubenswrapper[4830]: I0311 09:14:34.580935 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:34 crc kubenswrapper[4830]: I0311 09:14:34.580957 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:34 crc kubenswrapper[4830]: I0311 09:14:34.883681 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:34Z is after 2026-02-23T05:33:13Z Mar 11 09:14:35 crc kubenswrapper[4830]: I0311 09:14:35.131722 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:35 crc kubenswrapper[4830]: E0311 09:14:35.131802 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:35Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 09:14:35 crc kubenswrapper[4830]: I0311 09:14:35.133303 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 
09:14:35 crc kubenswrapper[4830]: I0311 09:14:35.133388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:35 crc kubenswrapper[4830]: I0311 09:14:35.133420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:35 crc kubenswrapper[4830]: I0311 09:14:35.133468 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:14:35 crc kubenswrapper[4830]: E0311 09:14:35.138157 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:35Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 09:14:35 crc kubenswrapper[4830]: I0311 09:14:35.883783 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:35Z is after 2026-02-23T05:33:13Z Mar 11 09:14:35 crc kubenswrapper[4830]: W0311 09:14:35.899181 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:35Z is after 2026-02-23T05:33:13Z Mar 11 09:14:35 crc kubenswrapper[4830]: E0311 09:14:35.899250 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:36 crc kubenswrapper[4830]: I0311 09:14:36.884217 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:36Z is after 2026-02-23T05:33:13Z Mar 11 09:14:36 crc kubenswrapper[4830]: I0311 09:14:36.935247 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:14:36 crc kubenswrapper[4830]: I0311 09:14:36.935463 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:36 crc kubenswrapper[4830]: I0311 09:14:36.936619 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:36 crc kubenswrapper[4830]: I0311 09:14:36.936736 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:36 crc kubenswrapper[4830]: I0311 09:14:36.936767 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:36 crc kubenswrapper[4830]: I0311 09:14:36.937604 4830 scope.go:117] "RemoveContainer" containerID="b6cc880c44e8e93bfe9c47b4b29beb1ca8b896e5172bfac6186b6cdaf58a3f82" Mar 11 09:14:36 crc kubenswrapper[4830]: E0311 09:14:36.937884 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:37 crc kubenswrapper[4830]: E0311 09:14:37.724317 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:37Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bbe96b9744e7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,LastTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:37 crc kubenswrapper[4830]: I0311 09:14:37.886750 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:37Z is after 2026-02-23T05:33:13Z Mar 11 09:14:38 crc kubenswrapper[4830]: I0311 09:14:38.513889 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:14:38 crc kubenswrapper[4830]: I0311 09:14:38.514085 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:38 crc kubenswrapper[4830]: I0311 09:14:38.515210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:38 crc 
kubenswrapper[4830]: I0311 09:14:38.515258 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:38 crc kubenswrapper[4830]: I0311 09:14:38.515275 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:38 crc kubenswrapper[4830]: I0311 09:14:38.515866 4830 scope.go:117] "RemoveContainer" containerID="b6cc880c44e8e93bfe9c47b4b29beb1ca8b896e5172bfac6186b6cdaf58a3f82" Mar 11 09:14:38 crc kubenswrapper[4830]: E0311 09:14:38.516087 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:38 crc kubenswrapper[4830]: I0311 09:14:38.883785 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:38Z is after 2026-02-23T05:33:13Z Mar 11 09:14:39 crc kubenswrapper[4830]: W0311 09:14:39.405073 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:39Z is after 2026-02-23T05:33:13Z Mar 11 09:14:39 crc kubenswrapper[4830]: E0311 09:14:39.405166 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: 
Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:39 crc kubenswrapper[4830]: I0311 09:14:39.888961 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:39Z is after 2026-02-23T05:33:13Z Mar 11 09:14:40 crc kubenswrapper[4830]: W0311 09:14:40.508138 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:40Z is after 2026-02-23T05:33:13Z Mar 11 09:14:40 crc kubenswrapper[4830]: E0311 09:14:40.508235 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:40 crc kubenswrapper[4830]: I0311 09:14:40.886308 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:40Z is after 2026-02-23T05:33:13Z Mar 11 09:14:41 crc 
kubenswrapper[4830]: I0311 09:14:41.885237 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:41Z is after 2026-02-23T05:33:13Z Mar 11 09:14:42 crc kubenswrapper[4830]: E0311 09:14:42.136999 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:42Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 09:14:42 crc kubenswrapper[4830]: I0311 09:14:42.139300 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:42 crc kubenswrapper[4830]: I0311 09:14:42.141117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:42 crc kubenswrapper[4830]: I0311 09:14:42.141179 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:42 crc kubenswrapper[4830]: I0311 09:14:42.141202 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:42 crc kubenswrapper[4830]: I0311 09:14:42.141242 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:14:42 crc kubenswrapper[4830]: E0311 09:14:42.145965 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:42Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 09:14:42 crc 
kubenswrapper[4830]: I0311 09:14:42.885551 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:42Z is after 2026-02-23T05:33:13Z Mar 11 09:14:43 crc kubenswrapper[4830]: E0311 09:14:43.020649 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 09:14:43 crc kubenswrapper[4830]: I0311 09:14:43.740995 4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:14:43 crc kubenswrapper[4830]: I0311 09:14:43.741132 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:14:43 crc kubenswrapper[4830]: I0311 09:14:43.885636 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:43Z is after 2026-02-23T05:33:13Z Mar 11 09:14:44 crc kubenswrapper[4830]: I0311 09:14:44.886080 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:44Z is after 2026-02-23T05:33:13Z Mar 11 09:14:45 crc kubenswrapper[4830]: I0311 09:14:45.885003 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:45Z is after 2026-02-23T05:33:13Z Mar 11 09:14:46 crc kubenswrapper[4830]: I0311 09:14:46.467335 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 11 09:14:46 crc kubenswrapper[4830]: I0311 09:14:46.467563 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:46 crc kubenswrapper[4830]: I0311 09:14:46.469037 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:46 crc kubenswrapper[4830]: I0311 09:14:46.469082 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:46 crc kubenswrapper[4830]: I0311 09:14:46.469092 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:46 crc kubenswrapper[4830]: I0311 09:14:46.885554 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:46Z is after 2026-02-23T05:33:13Z Mar 11 09:14:47 crc kubenswrapper[4830]: E0311 09:14:47.728890 4830 event.go:368] 
"Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:47Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189bbe96b9744e7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,LastTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:47 crc kubenswrapper[4830]: I0311 09:14:47.886448 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:47Z is after 2026-02-23T05:33:13Z Mar 11 09:14:48 crc kubenswrapper[4830]: I0311 09:14:48.887715 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:48Z is after 2026-02-23T05:33:13Z Mar 11 09:14:49 crc kubenswrapper[4830]: E0311 09:14:49.142963 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-11T09:14:49Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 11 09:14:49 crc kubenswrapper[4830]: I0311 09:14:49.146099 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:49 crc kubenswrapper[4830]: I0311 09:14:49.147796 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:49 crc kubenswrapper[4830]: I0311 09:14:49.148038 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:49 crc kubenswrapper[4830]: I0311 09:14:49.148158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:49 crc kubenswrapper[4830]: I0311 09:14:49.148306 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:14:49 crc kubenswrapper[4830]: E0311 09:14:49.153340 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:49Z is after 2026-02-23T05:33:13Z" node="crc" Mar 11 09:14:49 crc kubenswrapper[4830]: W0311 09:14:49.160435 4830 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:49Z is after 2026-02-23T05:33:13Z Mar 11 09:14:49 crc kubenswrapper[4830]: E0311 09:14:49.160547 4830 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:14:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 11 09:14:49 crc kubenswrapper[4830]: I0311 09:14:49.887573 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:50 crc kubenswrapper[4830]: I0311 09:14:50.888679 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:51 crc kubenswrapper[4830]: I0311 09:14:51.884000 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:51 crc kubenswrapper[4830]: I0311 09:14:51.931912 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:51 crc kubenswrapper[4830]: I0311 09:14:51.933066 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:51 crc kubenswrapper[4830]: I0311 09:14:51.933126 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:51 crc kubenswrapper[4830]: I0311 09:14:51.933142 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:51 crc kubenswrapper[4830]: I0311 09:14:51.933763 4830 scope.go:117] "RemoveContainer" 
containerID="b6cc880c44e8e93bfe9c47b4b29beb1ca8b896e5172bfac6186b6cdaf58a3f82" Mar 11 09:14:51 crc kubenswrapper[4830]: E0311 09:14:51.933971 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:14:52 crc kubenswrapper[4830]: I0311 09:14:52.887555 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:53 crc kubenswrapper[4830]: E0311 09:14:53.021012 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.740535 4830 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.740640 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.740725 4830 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.740921 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.742635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.742690 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.742704 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.743408 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.743525 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48" gracePeriod=30 Mar 11 09:14:53 crc kubenswrapper[4830]: I0311 09:14:53.887504 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:54 crc kubenswrapper[4830]: I0311 
09:14:54.216562 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 11 09:14:54 crc kubenswrapper[4830]: I0311 09:14:54.218457 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 11 09:14:54 crc kubenswrapper[4830]: I0311 09:14:54.218909 4830 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48" exitCode=255 Mar 11 09:14:54 crc kubenswrapper[4830]: I0311 09:14:54.218955 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48"} Mar 11 09:14:54 crc kubenswrapper[4830]: I0311 09:14:54.218990 4830 scope.go:117] "RemoveContainer" containerID="295bff38a4f50816d361fe6f33a66ff7cf2a597ba595601abea24c887dce296e" Mar 11 09:14:54 crc kubenswrapper[4830]: I0311 09:14:54.885290 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:55 crc kubenswrapper[4830]: I0311 09:14:55.224110 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 11 09:14:55 crc kubenswrapper[4830]: I0311 09:14:55.225531 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894"} Mar 11 09:14:55 crc kubenswrapper[4830]: I0311 09:14:55.225775 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:55 crc kubenswrapper[4830]: I0311 09:14:55.227005 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:55 crc kubenswrapper[4830]: I0311 09:14:55.227121 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:55 crc kubenswrapper[4830]: I0311 09:14:55.227146 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:55 crc kubenswrapper[4830]: I0311 09:14:55.886821 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:56 crc kubenswrapper[4830]: E0311 09:14:56.148504 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.153738 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.154944 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.154979 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:56 crc 
kubenswrapper[4830]: I0311 09:14:56.154988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.155029 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:14:56 crc kubenswrapper[4830]: E0311 09:14:56.161242 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.227942 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.228912 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.228942 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.228954 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:56 crc kubenswrapper[4830]: I0311 09:14:56.885865 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.737078 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96b9744e7b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,LastTimestamp:2026-03-11 09:13:52.866201211 +0000 UTC m=+0.647351930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.742264 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e1a63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,LastTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.748531 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e5023 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,LastTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.755526 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e7233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932926003 +0000 UTC m=+0.714076692,LastTimestamp:2026-03-11 09:13:52.932926003 +0000 UTC m=+0.714076692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.760081 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96c229bd97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:53.012309399 +0000 UTC m=+0.793460108,LastTimestamp:2026-03-11 09:13:53.012309399 +0000 UTC 
m=+0.793460108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.766572 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e1a63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e1a63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,LastTimestamp:2026-03-11 09:13:53.033367067 +0000 UTC m=+0.814517756,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.771169 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e5023\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e5023 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,LastTimestamp:2026-03-11 09:13:53.033389577 +0000 UTC m=+0.814540266,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.777221 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e7233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e7233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932926003 +0000 UTC m=+0.714076692,LastTimestamp:2026-03-11 09:13:53.033399687 +0000 UTC m=+0.814550376,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.781477 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e1a63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e1a63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,LastTimestamp:2026-03-11 09:13:53.03476512 +0000 UTC m=+0.815915809,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.786569 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e5023\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e5023 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,LastTimestamp:2026-03-11 09:13:53.03478937 +0000 UTC m=+0.815940059,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.790937 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e7233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e7233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932926003 +0000 UTC m=+0.714076692,LastTimestamp:2026-03-11 09:13:53.03480077 +0000 UTC m=+0.815951459,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.795532 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e1a63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e1a63 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,LastTimestamp:2026-03-11 09:13:53.035740371 +0000 UTC m=+0.816891080,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.799585 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e5023\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e5023 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,LastTimestamp:2026-03-11 09:13:53.035801261 +0000 UTC m=+0.816951970,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.803510 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e7233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e7233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932926003 +0000 UTC m=+0.714076692,LastTimestamp:2026-03-11 09:13:53.035817651 +0000 UTC m=+0.816968360,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.807391 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e1a63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e1a63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,LastTimestamp:2026-03-11 09:13:53.036522113 +0000 UTC m=+0.817672812,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.811834 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e5023\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e5023 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,LastTimestamp:2026-03-11 09:13:53.036543743 +0000 UTC m=+0.817694452,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.816235 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e1a63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e1a63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,LastTimestamp:2026-03-11 09:13:53.036558073 +0000 UTC m=+0.817708752,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.819946 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e7233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e7233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932926003 +0000 UTC 
m=+0.714076692,LastTimestamp:2026-03-11 09:13:53.036560043 +0000 UTC m=+0.817710742,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.823492 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e5023\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e5023 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,LastTimestamp:2026-03-11 09:13:53.036573113 +0000 UTC m=+0.817723802,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.829147 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e7233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e7233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932926003 +0000 UTC m=+0.714076692,LastTimestamp:2026-03-11 09:13:53.036581753 +0000 UTC m=+0.817732442,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.833821 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e1a63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e1a63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,LastTimestamp:2026-03-11 09:13:53.037211744 +0000 UTC m=+0.818362433,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.838739 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e5023\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e5023 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,LastTimestamp:2026-03-11 09:13:53.037232934 +0000 UTC m=+0.818383623,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.843685 4830 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e7233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e7233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932926003 +0000 UTC m=+0.714076692,LastTimestamp:2026-03-11 09:13:53.037241934 +0000 UTC m=+0.818392623,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.849066 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e1a63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e1a63 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932903523 +0000 UTC m=+0.714054212,LastTimestamp:2026-03-11 09:13:53.038592446 +0000 UTC m=+0.819743135,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.852662 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189bbe96bd6e5023\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189bbe96bd6e5023 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:52.932917283 +0000 UTC m=+0.714067972,LastTimestamp:2026-03-11 09:13:53.038611036 +0000 UTC m=+0.819761725,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.858230 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe96da7587be openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:53.419929534 +0000 UTC m=+1.201080223,LastTimestamp:2026-03-11 09:13:53.419929534 +0000 UTC m=+1.201080223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.862193 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bbe96db0d02ae openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:53.429856942 +0000 UTC m=+1.211007641,LastTimestamp:2026-03-11 09:13:53.429856942 +0000 UTC m=+1.211007641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.865679 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe96db7ded10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:53.437256976 +0000 UTC m=+1.218407685,LastTimestamp:2026-03-11 09:13:53.437256976 +0000 UTC m=+1.218407685,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.869244 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe96dbfd6617 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:53.445611031 +0000 UTC m=+1.226761740,LastTimestamp:2026-03-11 09:13:53.445611031 +0000 UTC m=+1.226761740,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.873140 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe96dc6ddb7a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:53.452981114 +0000 UTC m=+1.234131823,LastTimestamp:2026-03-11 09:13:53.452981114 +0000 UTC m=+1.234131823,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.876985 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe97006a61ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.056733099 +0000 UTC m=+1.837883798,LastTimestamp:2026-03-11 09:13:54.056733099 +0000 UTC m=+1.837883798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.882348 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bbe97006aaaa3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.056751779 +0000 UTC m=+1.837902468,LastTimestamp:2026-03-11 09:13:54.056751779 +0000 UTC m=+1.837902468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: I0311 09:14:57.883063 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.885786 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe970070431a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.05711849 +0000 UTC m=+1.838269179,LastTimestamp:2026-03-11 09:13:54.05711849 +0000 UTC m=+1.838269179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 
09:14:57.889873 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe970074eca8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.05742404 +0000 UTC m=+1.838574729,LastTimestamp:2026-03-11 09:13:54.05742404 +0000 UTC m=+1.838574729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.895002 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9700c1558d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.062431629 +0000 UTC m=+1.843582328,LastTimestamp:2026-03-11 09:13:54.062431629 +0000 UTC m=+1.843582328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.899434 4830 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bbe9701258c71 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.068999281 +0000 UTC m=+1.850149970,LastTimestamp:2026-03-11 09:13:54.068999281 +0000 UTC m=+1.850149970,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.920956 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe97013c5133 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.070491443 +0000 UTC m=+1.851642132,LastTimestamp:2026-03-11 09:13:54.070491443 +0000 UTC m=+1.851642132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.926192 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe9701402ee5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.070744805 +0000 UTC m=+1.851895494,LastTimestamp:2026-03-11 09:13:54.070744805 +0000 UTC m=+1.851895494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.930943 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9701549639 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.072081977 +0000 UTC m=+1.853232656,LastTimestamp:2026-03-11 09:13:54.072081977 +0000 UTC m=+1.853232656,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: I0311 09:14:57.931566 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:14:57 crc kubenswrapper[4830]: I0311 09:14:57.932673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:14:57 crc kubenswrapper[4830]: I0311 09:14:57.932721 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:14:57 crc kubenswrapper[4830]: I0311 09:14:57.932732 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.934964 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97017836a1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.074416801 +0000 UTC m=+1.855567490,LastTimestamp:2026-03-11 09:13:54.074416801 +0000 UTC m=+1.855567490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc 
kubenswrapper[4830]: E0311 09:14:57.939277 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9701db68a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.080917673 +0000 UTC m=+1.862068362,LastTimestamp:2026-03-11 09:13:54.080917673 +0000 UTC m=+1.862068362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.943778 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe97132302ed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.370822893 +0000 UTC m=+2.151973582,LastTimestamp:2026-03-11 09:13:54.370822893 +0000 UTC m=+2.151973582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.948398 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9713d1245a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.382234714 +0000 UTC m=+2.163385443,LastTimestamp:2026-03-11 09:13:54.382234714 +0000 UTC m=+2.163385443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.952550 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9713eaf0ef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.383925487 +0000 UTC m=+2.165076216,LastTimestamp:2026-03-11 09:13:54.383925487 +0000 UTC m=+2.165076216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.956390 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe97228aea70 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.629290608 +0000 UTC m=+2.410441307,LastTimestamp:2026-03-11 09:13:54.629290608 +0000 UTC m=+2.410441307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.959978 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe972386a5f5 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.645788149 +0000 UTC m=+2.426938838,LastTimestamp:2026-03-11 09:13:54.645788149 +0000 UTC m=+2.426938838,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.963795 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9723a6a3e1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.647884769 +0000 UTC m=+2.429035498,LastTimestamp:2026-03-11 09:13:54.647884769 +0000 UTC m=+2.429035498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 
09:14:57.968746 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9731407b09 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.876070665 +0000 UTC m=+2.657221354,LastTimestamp:2026-03-11 09:13:54.876070665 +0000 UTC m=+2.657221354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.974126 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe97325e8e70 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.894818928 +0000 UTC m=+2.675969617,LastTimestamp:2026-03-11 
09:13:54.894818928 +0000 UTC m=+2.675969617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.978972 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe97369540a6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.965512358 +0000 UTC m=+2.746663107,LastTimestamp:2026-03-11 09:13:54.965512358 +0000 UTC m=+2.746663107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.983882 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bbe97369923b4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.965767092 +0000 UTC m=+2.746917781,LastTimestamp:2026-03-11 09:13:54.965767092 +0000 UTC m=+2.746917781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.988762 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe973754b2de openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.978058974 +0000 UTC m=+2.759209703,LastTimestamp:2026-03-11 09:13:54.978058974 +0000 UTC m=+2.759209703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:57 crc kubenswrapper[4830]: E0311 09:14:57.995708 4830 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe9737bf8a4e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.985060942 +0000 UTC m=+2.766211671,LastTimestamp:2026-03-11 09:13:54.985060942 +0000 UTC m=+2.766211671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.001533 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe97575da815 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.515516949 +0000 UTC m=+3.296667668,LastTimestamp:2026-03-11 09:13:55.515516949 +0000 UTC m=+3.296667668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.006713 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bbe975793b379 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.519058809 +0000 UTC m=+3.300209518,LastTimestamp:2026-03-11 09:13:55.519058809 +0000 UTC m=+3.300209518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.011194 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe975793b3ab openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.519058859 +0000 UTC m=+3.300209568,LastTimestamp:2026-03-11 09:13:55.519058859 
+0000 UTC m=+3.300209568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.016957 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97579513c5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.519148997 +0000 UTC m=+3.300299696,LastTimestamp:2026-03-11 09:13:55.519148997 +0000 UTC m=+3.300299696,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.022887 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe97582defef openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.529166831 +0000 UTC m=+3.310317530,LastTimestamp:2026-03-11 
09:13:55.529166831 +0000 UTC m=+3.310317530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.027763 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe975844cc48 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.530665032 +0000 UTC m=+3.311815721,LastTimestamp:2026-03-11 09:13:55.530665032 +0000 UTC m=+3.311815721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.031733 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe97586b8a28 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.533204008 +0000 UTC m=+3.314354697,LastTimestamp:2026-03-11 09:13:55.533204008 +0000 UTC m=+3.314354697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.039376 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe97587b6b53 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.534244691 +0000 UTC m=+3.315395380,LastTimestamp:2026-03-11 09:13:55.534244691 +0000 UTC m=+3.315395380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.040892 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189bbe975892ebf8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.535784952 +0000 UTC m=+3.316935641,LastTimestamp:2026-03-11 09:13:55.535784952 +0000 UTC m=+3.316935641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.043750 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe9758d43612 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.540063762 +0000 UTC m=+3.321214451,LastTimestamp:2026-03-11 09:13:55.540063762 +0000 UTC m=+3.321214451,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.047774 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9763a5d75b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.721574235 +0000 UTC m=+3.502724924,LastTimestamp:2026-03-11 09:13:55.721574235 +0000 UTC m=+3.502724924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.051672 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe9763a72d39 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.721661753 +0000 UTC m=+3.502812492,LastTimestamp:2026-03-11 09:13:55.721661753 +0000 UTC m=+3.502812492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.055349 4830 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe976472a6a2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.734996642 +0000 UTC m=+3.516147331,LastTimestamp:2026-03-11 09:13:55.734996642 +0000 UTC m=+3.516147331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.058916 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe97648091ab openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.735908779 +0000 UTC m=+3.517059468,LastTimestamp:2026-03-11 09:13:55.735908779 +0000 UTC 
m=+3.517059468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.062582 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe9764b4e8c7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.739338951 +0000 UTC m=+3.520489660,LastTimestamp:2026-03-11 09:13:55.739338951 +0000 UTC m=+3.520489660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.066521 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe9764dcedca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.741961674 +0000 UTC m=+3.523112363,LastTimestamp:2026-03-11 09:13:55.741961674 +0000 UTC m=+3.523112363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.070823 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe97718d62e8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.954852584 +0000 UTC m=+3.736003263,LastTimestamp:2026-03-11 09:13:55.954852584 +0000 UTC m=+3.736003263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.075100 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9771b4f3ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.957445548 +0000 UTC m=+3.738596237,LastTimestamp:2026-03-11 09:13:55.957445548 +0000 UTC m=+3.738596237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.079152 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9772c68b2c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.97537566 +0000 UTC m=+3.756526349,LastTimestamp:2026-03-11 09:13:55.97537566 +0000 UTC m=+3.756526349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.082891 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9772dab9eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.976698347 +0000 UTC m=+3.757849036,LastTimestamp:2026-03-11 09:13:55.976698347 +0000 UTC m=+3.757849036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.087048 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189bbe9772dad9f2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:55.976706546 +0000 UTC m=+3.757857235,LastTimestamp:2026-03-11 09:13:55.976706546 +0000 UTC m=+3.757857235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.091945 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe977445e680 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:56.000499328 +0000 UTC m=+3.781650027,LastTimestamp:2026-03-11 09:13:56.000499328 +0000 UTC m=+3.781650027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.096667 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe977d6b7891 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 
09:13:56.153956497 +0000 UTC m=+3.935107196,LastTimestamp:2026-03-11 09:13:56.153956497 +0000 UTC m=+3.935107196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.101223 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe977d8162b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:56.155392696 +0000 UTC m=+3.936543386,LastTimestamp:2026-03-11 09:13:56.155392696 +0000 UTC m=+3.936543386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.105103 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe977e5ebd25 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:56.169899301 +0000 UTC m=+3.951049990,LastTimestamp:2026-03-11 09:13:56.169899301 +0000 UTC m=+3.951049990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.109556 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe977e7b17aa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:56.171757482 +0000 UTC m=+3.952908171,LastTimestamp:2026-03-11 09:13:56.171757482 +0000 UTC m=+3.952908171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.113986 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe977e9910a6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:56.173721766 +0000 UTC m=+3.954872455,LastTimestamp:2026-03-11 09:13:56.173721766 +0000 UTC m=+3.954872455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.117897 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9787c1ccef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:56.327386351 +0000 UTC m=+4.108537040,LastTimestamp:2026-03-11 09:13:56.327386351 +0000 UTC m=+4.108537040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.122338 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe97886b4047 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:56.338491463 +0000 UTC m=+4.119642152,LastTimestamp:2026-03-11 09:13:56.338491463 +0000 UTC m=+4.119642152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.126850 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97b08f7730 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.011953456 +0000 UTC m=+4.793104145,LastTimestamp:2026-03-11 09:13:57.011953456 +0000 UTC m=+4.793104145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.130959 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97bb1b238d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.188879245 +0000 UTC m=+4.970029934,LastTimestamp:2026-03-11 09:13:57.188879245 +0000 UTC m=+4.970029934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.134539 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97bba8f952 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.198174546 +0000 UTC m=+4.979325235,LastTimestamp:2026-03-11 09:13:57.198174546 +0000 UTC m=+4.979325235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.138062 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189bbe97bbba5820 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.199312928 +0000 UTC m=+4.980463617,LastTimestamp:2026-03-11 09:13:57.199312928 +0000 UTC m=+4.980463617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.141215 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97c6c72d86 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.384703366 +0000 UTC m=+5.165854055,LastTimestamp:2026-03-11 09:13:57.384703366 +0000 UTC m=+5.165854055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.145371 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97c7a8fede openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.399502558 +0000 UTC m=+5.180653247,LastTimestamp:2026-03-11 09:13:57.399502558 +0000 UTC m=+5.180653247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.148699 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97c7b64d9a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.400374682 +0000 UTC m=+5.181525371,LastTimestamp:2026-03-11 09:13:57.400374682 +0000 UTC m=+5.181525371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.152814 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97d258e36f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.578802031 +0000 UTC m=+5.359952730,LastTimestamp:2026-03-11 09:13:57.578802031 +0000 UTC m=+5.359952730,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.156364 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97d2f283b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.588870065 +0000 UTC m=+5.370020774,LastTimestamp:2026-03-11 09:13:57.588870065 +0000 UTC m=+5.370020774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.160762 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189bbe97d30323aa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.589959594 +0000 UTC m=+5.371110283,LastTimestamp:2026-03-11 09:13:57.589959594 +0000 UTC m=+5.371110283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.165076 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97de890f68 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.783285608 +0000 UTC m=+5.564436297,LastTimestamp:2026-03-11 09:13:57.783285608 +0000 UTC m=+5.564436297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.168635 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97df64a0a1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.797675169 +0000 UTC m=+5.578825868,LastTimestamp:2026-03-11 09:13:57.797675169 +0000 UTC m=+5.578825868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.172196 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97df7e042f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.799339055 +0000 UTC m=+5.580489754,LastTimestamp:2026-03-11 09:13:57.799339055 +0000 UTC m=+5.580489754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.176667 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97eb06506b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:57.992820843 +0000 UTC m=+5.773971532,LastTimestamp:2026-03-11 09:13:57.992820843 +0000 UTC m=+5.773971532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.181348 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189bbe97ebb29292 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:58.00410997 +0000 UTC m=+5.785260659,LastTimestamp:2026-03-11 09:13:58.00410997 +0000 UTC m=+5.785260659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.188044 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 11 09:14:58 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-controller-manager-crc.189bbe9941a01704 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 11 09:14:58 crc kubenswrapper[4830]: body: Mar 11 09:14:58 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:03.740706564 +0000 UTC m=+11.521857303,LastTimestamp:2026-03-11 09:14:03.740706564 +0000 UTC m=+11.521857303,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 09:14:58 crc kubenswrapper[4830]: > Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.191917 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9941a15f1c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:03.740790556 +0000 UTC m=+11.521941255,LastTimestamp:2026-03-11 09:14:03.740790556 +0000 UTC m=+11.521941255,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.197952 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 09:14:58 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-apiserver-crc.189bbe9a1128382c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:46788->192.168.126.11:17697: read: connection reset by peer Mar 11 09:14:58 crc kubenswrapper[4830]: body: Mar 11 09:14:58 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:07.22251166 +0000 UTC m=+15.003662349,LastTimestamp:2026-03-11 09:14:07.22251166 +0000 UTC m=+15.003662349,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 09:14:58 crc kubenswrapper[4830]: > Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.202438 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9a1129c95b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46788->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:07.222614363 +0000 UTC m=+15.003765052,LastTimestamp:2026-03-11 09:14:07.222614363 +0000 UTC m=+15.003765052,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.206895 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 09:14:58 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-apiserver-crc.189bbe9a2e9062a2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 11 09:14:58 crc kubenswrapper[4830]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 11 09:14:58 crc kubenswrapper[4830]: Mar 11 09:14:58 crc kubenswrapper[4830]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:07.715877538 +0000 UTC m=+15.497028237,LastTimestamp:2026-03-11 09:14:07.715877538 +0000 UTC m=+15.497028237,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 09:14:58 crc kubenswrapper[4830]: > Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.211160 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9a2e90ff51 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:07.715917649 +0000 UTC m=+15.497068338,LastTimestamp:2026-03-11 09:14:07.715917649 +0000 UTC m=+15.497068338,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.217159 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 11 09:14:58 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-apiserver-crc.189bbe9a2f17ecb9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 11 09:14:58 crc kubenswrapper[4830]: body: [+]ping ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]log ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]etcd ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/generic-apiserver-start-informers ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/priority-and-fairness-filter ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/start-apiextensions-informers ok Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/start-system-namespaces-controller ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 11 09:14:58 crc kubenswrapper[4830]: 
[+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/start-kube-aggregator-informers ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 11 09:14:58 crc kubenswrapper[4830]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 11 09:14:58 crc kubenswrapper[4830]: [-]autoregister-completion failed: reason withheld Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/apiservice-openapi-controller ok Mar 11 09:14:58 crc kubenswrapper[4830]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 11 09:14:58 crc kubenswrapper[4830]: livez 
check failed Mar 11 09:14:58 crc kubenswrapper[4830]: Mar 11 09:14:58 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:07.724760249 +0000 UTC m=+15.505910958,LastTimestamp:2026-03-11 09:14:07.724760249 +0000 UTC m=+15.505910958,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 09:14:58 crc kubenswrapper[4830]: > Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.224069 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe9a2f18de05 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:07.724822021 +0000 UTC m=+15.505972720,LastTimestamp:2026-03-11 09:14:07.724822021 +0000 UTC m=+15.505972720,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.229590 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189bbe977e7b17aa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189bbe977e7b17aa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:56.171757482 +0000 UTC m=+3.952908171,LastTimestamp:2026-03-11 09:14:08.047223863 +0000 UTC m=+15.828374552,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.235628 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 09:14:58 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-controller-manager-crc.189bbe9b95b4d290 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 09:14:58 crc kubenswrapper[4830]: body: Mar 11 09:14:58 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:13.741286032 +0000 UTC m=+21.522436751,LastTimestamp:2026-03-11 09:14:13.741286032 +0000 UTC 
m=+21.522436751,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 09:14:58 crc kubenswrapper[4830]: > Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.239832 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9b95b5eaa5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:13.741357733 +0000 UTC m=+21.522508462,LastTimestamp:2026-03-11 09:14:13.741357733 +0000 UTC m=+21.522508462,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.245304 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bbe9b95b4d290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 09:14:58 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-controller-manager-crc.189bbe9b95b4d290 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 09:14:58 crc kubenswrapper[4830]: body: Mar 11 09:14:58 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:13.741286032 +0000 UTC m=+21.522436751,LastTimestamp:2026-03-11 09:14:23.741336994 +0000 UTC m=+31.522487773,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 09:14:58 crc kubenswrapper[4830]: > Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.250234 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bbe9b95b5eaa5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9b95b5eaa5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:13.741357733 +0000 UTC m=+21.522508462,LastTimestamp:2026-03-11 09:14:23.741469347 
+0000 UTC m=+31.522620066,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.251478 4830 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9de9f904aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:23.744976042 +0000 UTC m=+31.526126801,LastTimestamp:2026-03-11 09:14:23.744976042 +0000 UTC m=+31.526126801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.257051 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bbe9701549639\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9701549639 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.072081977 +0000 UTC m=+1.853232656,LastTimestamp:2026-03-11 09:14:25.069654823 +0000 UTC m=+32.850805552,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.261665 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bbe97132302ed\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe97132302ed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.370822893 +0000 UTC m=+2.151973582,LastTimestamp:2026-03-11 09:14:25.401161382 +0000 UTC m=+33.182312111,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.266873 4830 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189bbe9713d1245a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9713d1245a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:13:54.382234714 +0000 UTC m=+2.163385443,LastTimestamp:2026-03-11 09:14:25.530039635 +0000 UTC m=+33.311190334,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.274830 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bbe9b95b4d290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 09:14:58 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-controller-manager-crc.189bbe9b95b4d290 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
11 09:14:58 crc kubenswrapper[4830]: body: Mar 11 09:14:58 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:13.741286032 +0000 UTC m=+21.522436751,LastTimestamp:2026-03-11 09:14:33.741203241 +0000 UTC m=+41.522353930,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 09:14:58 crc kubenswrapper[4830]: > Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.279873 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bbe9b95b5eaa5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189bbe9b95b5eaa5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:13.741357733 +0000 UTC m=+21.522508462,LastTimestamp:2026-03-11 09:14:33.741247182 +0000 UTC m=+41.522397871,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:14:58 crc kubenswrapper[4830]: E0311 09:14:58.285185 4830 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189bbe9b95b4d290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event=< Mar 11 09:14:58 crc kubenswrapper[4830]: &Event{ObjectMeta:{kube-controller-manager-crc.189bbe9b95b4d290 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 11 09:14:58 crc kubenswrapper[4830]: body: Mar 11 09:14:58 crc kubenswrapper[4830]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:14:13.741286032 +0000 UTC m=+21.522436751,LastTimestamp:2026-03-11 09:14:43.741102619 +0000 UTC m=+51.522253348,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 11 09:14:58 crc kubenswrapper[4830]: > Mar 11 09:14:58 crc kubenswrapper[4830]: I0311 09:14:58.889918 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:14:59 crc kubenswrapper[4830]: I0311 09:14:59.886166 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:00 crc kubenswrapper[4830]: I0311 09:15:00.740729 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:15:00 crc 
kubenswrapper[4830]: I0311 09:15:00.740995 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:00 crc kubenswrapper[4830]: I0311 09:15:00.742462 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:00 crc kubenswrapper[4830]: I0311 09:15:00.742500 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:00 crc kubenswrapper[4830]: I0311 09:15:00.742508 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:00 crc kubenswrapper[4830]: I0311 09:15:00.748909 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:15:00 crc kubenswrapper[4830]: I0311 09:15:00.884304 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:01 crc kubenswrapper[4830]: I0311 09:15:01.240987 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:01 crc kubenswrapper[4830]: I0311 09:15:01.241092 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:15:01 crc kubenswrapper[4830]: I0311 09:15:01.241794 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:01 crc kubenswrapper[4830]: I0311 09:15:01.241834 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:01 crc kubenswrapper[4830]: I0311 09:15:01.241850 4830 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:01 crc kubenswrapper[4830]: I0311 09:15:01.886066 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.242846 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.244438 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.244480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.244491 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.883787 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.932160 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.933754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.933788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.933800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 11 09:15:02 crc kubenswrapper[4830]: I0311 09:15:02.934369 4830 scope.go:117] "RemoveContainer" containerID="b6cc880c44e8e93bfe9c47b4b29beb1ca8b896e5172bfac6186b6cdaf58a3f82" Mar 11 09:15:03 crc kubenswrapper[4830]: E0311 09:15:03.022099 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 09:15:03 crc kubenswrapper[4830]: E0311 09:15:03.153183 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 11 09:15:03 crc kubenswrapper[4830]: I0311 09:15:03.161349 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:03 crc kubenswrapper[4830]: I0311 09:15:03.162479 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:03 crc kubenswrapper[4830]: I0311 09:15:03.162523 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:03 crc kubenswrapper[4830]: I0311 09:15:03.162535 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:03 crc kubenswrapper[4830]: I0311 09:15:03.162562 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:15:03 crc kubenswrapper[4830]: E0311 09:15:03.168225 4830 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 11 09:15:03 crc kubenswrapper[4830]: I0311 09:15:03.883916 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.217793 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.238347 4830 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.261595 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.262308 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.264540 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" exitCode=255 Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.264589 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4"} Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.264631 4830 scope.go:117] "RemoveContainer" containerID="b6cc880c44e8e93bfe9c47b4b29beb1ca8b896e5172bfac6186b6cdaf58a3f82" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.264849 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.266178 4830 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.266224 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.266242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.267070 4830 scope.go:117] "RemoveContainer" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" Mar 11 09:15:04 crc kubenswrapper[4830]: E0311 09:15:04.267319 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.581564 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.581733 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.583064 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.583099 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.583110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:04 crc kubenswrapper[4830]: I0311 09:15:04.884971 4830 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:05 crc kubenswrapper[4830]: I0311 09:15:05.267929 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 09:15:05 crc kubenswrapper[4830]: I0311 09:15:05.885714 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:06 crc kubenswrapper[4830]: I0311 09:15:06.885799 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:06 crc kubenswrapper[4830]: I0311 09:15:06.935577 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:15:06 crc kubenswrapper[4830]: I0311 09:15:06.935707 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:06 crc kubenswrapper[4830]: I0311 09:15:06.936602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:06 crc kubenswrapper[4830]: I0311 09:15:06.936628 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:06 crc kubenswrapper[4830]: I0311 09:15:06.936636 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:06 crc kubenswrapper[4830]: 
I0311 09:15:06.937065 4830 scope.go:117] "RemoveContainer" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" Mar 11 09:15:06 crc kubenswrapper[4830]: E0311 09:15:06.937215 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:15:07 crc kubenswrapper[4830]: I0311 09:15:07.885347 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:08 crc kubenswrapper[4830]: I0311 09:15:08.513865 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:15:08 crc kubenswrapper[4830]: I0311 09:15:08.514175 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:08 crc kubenswrapper[4830]: I0311 09:15:08.515880 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:08 crc kubenswrapper[4830]: I0311 09:15:08.515951 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:08 crc kubenswrapper[4830]: I0311 09:15:08.515965 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:08 crc kubenswrapper[4830]: I0311 09:15:08.516818 4830 scope.go:117] "RemoveContainer" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" Mar 11 09:15:08 crc kubenswrapper[4830]: E0311 
09:15:08.518062 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:15:08 crc kubenswrapper[4830]: I0311 09:15:08.886601 4830 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 11 09:15:09 crc kubenswrapper[4830]: I0311 09:15:09.400183 4830 csr.go:261] certificate signing request csr-ww9zx is approved, waiting to be issued Mar 11 09:15:09 crc kubenswrapper[4830]: I0311 09:15:09.409351 4830 csr.go:257] certificate signing request csr-ww9zx is issued Mar 11 09:15:09 crc kubenswrapper[4830]: I0311 09:15:09.486345 4830 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 11 09:15:09 crc kubenswrapper[4830]: I0311 09:15:09.706821 4830 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.169264 4830 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.171114 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.171165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.171191 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.171497 4830 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.182862 4830 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.183338 4830 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.183377 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.190249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.190308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.190335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.190368 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.190394 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:10Z","lastTransitionTime":"2026-03-11T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.211760 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.228223 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.228302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.228331 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.228370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.228404 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:10Z","lastTransitionTime":"2026-03-11T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.246767 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.259437 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.259500 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.259520 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.259548 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.259568 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:10Z","lastTransitionTime":"2026-03-11T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.278123 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.288005 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.288103 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.288123 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.288152 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.288172 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:10Z","lastTransitionTime":"2026-03-11T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.310009 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.310532 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.310627 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.410796 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.410869 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-31 04:04:57.579444329 +0000 UTC Mar 11 09:15:10 crc kubenswrapper[4830]: I0311 09:15:10.411329 4830 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7074h49m47.168128482s for next certificate rotation Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.511546 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.612249 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.713366 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.814417 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:10 crc kubenswrapper[4830]: E0311 09:15:10.914983 4830 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.015674 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.116095 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.216942 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.317635 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.417851 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.518782 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.619977 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.720762 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.821490 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:11 crc kubenswrapper[4830]: E0311 09:15:11.922049 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.022545 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.122672 4830 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.223120 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.323647 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.423782 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.524619 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.625338 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.726486 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.826974 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:12 crc kubenswrapper[4830]: E0311 09:15:12.927823 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.022735 4830 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.027919 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.129101 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.229308 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.330116 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.431479 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.532402 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.633337 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.734115 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.835305 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:13 crc kubenswrapper[4830]: E0311 09:15:13.936248 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.037161 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.137305 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.238439 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.338538 4830 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.439716 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.540527 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.641659 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.742871 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.844103 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:14 crc kubenswrapper[4830]: E0311 09:15:14.944372 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.045232 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.145653 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.246658 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.347120 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.448140 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.549261 4830 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.650336 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.751476 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.852347 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:15 crc kubenswrapper[4830]: E0311 09:15:15.953227 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.053605 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.153882 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.254122 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.355100 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.455282 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.556121 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.657165 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:16 crc 
kubenswrapper[4830]: E0311 09:15:16.757702 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.857976 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:16 crc kubenswrapper[4830]: E0311 09:15:16.958618 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.059650 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.160056 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.260223 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.360552 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.461135 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.562078 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.662943 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.763291 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.864168 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:17 crc kubenswrapper[4830]: E0311 09:15:17.964798 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.065683 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.166657 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.267957 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.368620 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.469488 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.570538 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.671256 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.771473 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.872797 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:18 crc kubenswrapper[4830]: E0311 09:15:18.973754 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.074667 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.174837 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.275993 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.376995 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.478135 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.578931 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.679568 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.779853 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.880850 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:19 crc kubenswrapper[4830]: E0311 09:15:19.982159 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.083175 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.183836 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.284954 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.385904 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.421118 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.426089 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.426122 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.426133 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.426148 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.426159 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:20Z","lastTransitionTime":"2026-03-11T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.437937 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.447300 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.447332 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.447345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.447362 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.447374 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:20Z","lastTransitionTime":"2026-03-11T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.461959 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.473278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.473308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.473322 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.473339 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.473352 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:20Z","lastTransitionTime":"2026-03-11T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.487030 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.496822 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.496850 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.496863 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.496878 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.496891 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:20Z","lastTransitionTime":"2026-03-11T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.513394 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.513549 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.513578 4830 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.577080 4830 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.616378 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.616412 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.616424 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.616442 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.616454 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:20Z","lastTransitionTime":"2026-03-11T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.723342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.724215 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.724298 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.724336 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.724357 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:20Z","lastTransitionTime":"2026-03-11T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.827255 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.827302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.827318 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.827342 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.827359 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:20Z","lastTransitionTime":"2026-03-11T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.901048 4830 apiserver.go:52] "Watching apiserver" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.907643 4830 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.907942 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.908396 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.908628 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.908711 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.908800 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.908887 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.908903 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.909054 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.909121 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:20 crc kubenswrapper[4830]: E0311 09:15:20.909556 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.911118 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.911337 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.911697 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.911819 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.911987 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.912045 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.912195 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.912342 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.912521 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.928997 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:20 crc 
kubenswrapper[4830]: I0311 09:15:20.929041 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.929053 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.929069 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.929081 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:20Z","lastTransitionTime":"2026-03-11T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.940543 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.951676 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.961083 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.970188 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.978984 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.979654 4830 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.989036 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 11 09:15:20 crc kubenswrapper[4830]: I0311 09:15:20.998673 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.032413 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.032473 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.032489 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.032513 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.032530 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.065968 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066099 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066159 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066214 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066261 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066313 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066365 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066413 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066468 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066523 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066570 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066621 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066668 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066714 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066763 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066811 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066900 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.066956 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067005 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067101 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067154 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067204 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067254 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067308 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067362 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067421 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067473 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067521 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067574 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067630 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067678 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068123 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068177 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068230 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068314 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068366 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068421 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068474 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068528 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068576 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068628 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068678 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068729 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068784 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 09:15:21 crc 
kubenswrapper[4830]: I0311 09:15:21.068835 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068885 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068936 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068987 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069078 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069133 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069186 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069235 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069284 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069340 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069391 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 09:15:21 crc 
kubenswrapper[4830]: I0311 09:15:21.069446 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069502 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069555 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069607 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069658 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069715 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069769 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069824 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069876 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069929 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069977 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070068 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070124 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070183 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070238 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070290 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070341 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070403 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070453 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070507 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070560 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070612 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070667 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070719 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070774 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070862 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070917 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070974 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " 
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071067 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071126 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071182 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071233 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071289 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071343 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071395 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071449 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071501 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071553 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071607 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 09:15:21 crc kubenswrapper[4830]: 
I0311 09:15:21.071666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071723 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071778 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071836 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071890 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071947 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071997 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072094 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072148 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072202 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072260 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 
09:15:21.072312 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072365 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072420 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072473 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072526 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072579 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072634 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072687 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072744 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072797 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072855 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072909 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072963 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073056 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073124 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073185 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073242 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073294 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073350 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073407 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073463 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073516 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073573 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073629 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073687 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073748 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073806 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073868 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073926 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073981 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074074 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074142 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074200 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074257 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074313 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074371 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074430 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074485 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074543 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 
09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074600 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074657 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074714 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074768 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074984 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075087 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075149 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075211 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075269 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075325 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075387 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075448 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075505 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075638 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075698 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075759 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075820 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076066 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076280 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076309 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076335 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 11 09:15:21 crc 
kubenswrapper[4830]: I0311 09:15:21.076363 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076386 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076408 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076430 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076453 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076475 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076498 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076521 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076544 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076589 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076611 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076634 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076655 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076676 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076786 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076817 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076842 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076865 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076889 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076912 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076935 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076958 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076983 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077006 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077049 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077072 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077097 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077120 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077164 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077194 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077220 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077249 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077274 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077297 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077321 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077344 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077370 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077392 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077416 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077440 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077491 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.067951 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068174 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068168 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068264 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068640 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.068644 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069051 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069111 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069155 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069519 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069576 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069740 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077961 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.078052 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.078310 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.078327 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.069929 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070120 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070295 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070286 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070340 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070883 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.078473 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.070985 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071303 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071362 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071612 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071897 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.071943 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072084 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072442 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072543 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072633 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.072831 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073105 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073282 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073416 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073658 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073768 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073936 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.073855 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074125 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074382 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074517 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074624 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.074794 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075071 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075431 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075486 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075640 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.075963 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076005 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076052 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076087 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076313 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076701 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.076817 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077008 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077097 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077370 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077448 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077508 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.077709 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.078767 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.078872 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.079611 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.079816 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.081183 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.082910 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.082949 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.083239 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.083317 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.083308 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.083776 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.083995 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084188 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084244 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084425 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084580 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084684 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084776 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084802 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084828 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.084867 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.085100 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.085367 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.085433 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.085526 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.085596 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.085825 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.085828 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.085882 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.086039 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.086064 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.085911 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.086286 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.086292 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.086549 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.086643 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.086838 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.087049 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.087075 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:21.585535946 +0000 UTC m=+89.366686695 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.087105 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.087258 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.087738 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.087738 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.087925 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.087967 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.088635 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.088690 4830 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.088732 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.088980 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.089132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.089337 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.089619 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.089736 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.089847 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.089927 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.090255 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.090509 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.090674 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.090890 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.090890 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.091171 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.091246 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.091626 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.091648 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.091769 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:15:21.591744236 +0000 UTC m=+89.372895025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.092210 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.093925 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.094162 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.094199 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.094289 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.094396 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.094541 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.094707 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.094859 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.095698 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.095827 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.095973 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.096456 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.096559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.096868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.097009 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.097046 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.097417 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.097667 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.097796 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.097934 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.097992 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.098378 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.098489 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.098951 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.099083 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.099090 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.099206 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.099343 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:21.599319412 +0000 UTC m=+89.380470201 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.099437 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.099485 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.100083 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.102947 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.103166 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.103194 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.103209 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.103229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.103278 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:21.6032574 +0000 UTC m=+89.384408199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.103504 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.103520 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.103530 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.103565 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-11 09:15:21.603554958 +0000 UTC m=+89.384705747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.103565 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.103736 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.103042 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.103984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.105665 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.105712 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.107572 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.107603 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.107750 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.108101 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.109219 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.109402 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.109597 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.109708 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.113621 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.114047 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.114483 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.114516 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.114223 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.114730 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.114996 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.115148 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.115421 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.116087 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.116258 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.116673 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.116937 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.117142 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.117229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.117241 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.117326 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.117576 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.117584 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.117691 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.118382 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.119352 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.121654 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.123004 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.125892 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.126077 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.128390 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.130519 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.130859 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.135103 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.135134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.135145 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.135161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.135172 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.142666 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178240 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178324 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178383 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178396 4830 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178411 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178423 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178435 4830 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178446 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178457 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178469 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178480 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178491 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178502 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178514 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178526 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178537 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178549 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178560 4830 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178571 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178582 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178594 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178605 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178617 4830 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178628 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178639 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178650 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178662 4830 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178673 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc 
kubenswrapper[4830]: I0311 09:15:21.178684 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178695 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178706 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178718 4830 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178729 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178740 4830 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178751 4830 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178762 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178802 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178814 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178825 4830 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178836 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178849 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178859 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178870 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" 
DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178881 4830 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178895 4830 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178907 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178917 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178929 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178939 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178951 4830 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 
09:15:21.178962 4830 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178972 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178984 4830 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.178995 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179005 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179034 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179045 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179056 4830 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179069 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179079 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179090 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179103 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179114 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179172 4830 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179189 4830 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 11 
09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179201 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179213 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179227 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179241 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179254 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179267 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179278 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179289 4830 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179300 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179311 4830 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179322 4830 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179333 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179344 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179354 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179365 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179376 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179388 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179398 4830 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179410 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179422 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179434 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179445 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179457 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179467 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179478 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179489 4830 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179500 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179511 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179522 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 
crc kubenswrapper[4830]: I0311 09:15:21.179534 4830 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179545 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179556 4830 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179567 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179578 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179589 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179600 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179611 4830 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179622 4830 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179634 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179644 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179655 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179666 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179677 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179689 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179701 4830 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179713 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179724 4830 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179734 4830 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179745 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179758 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179770 4830 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179782 4830 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179793 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179804 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179819 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179830 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179841 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179852 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") 
on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179863 4830 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179874 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179885 4830 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179896 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179907 4830 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179919 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179942 4830 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 
09:15:21.179955 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179966 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179977 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179987 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.179998 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180009 4830 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180037 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180048 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180059 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180069 4830 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180080 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180091 4830 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180102 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180113 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180124 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath 
\"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180135 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180147 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180158 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180169 4830 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180180 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180192 4830 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180204 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc 
kubenswrapper[4830]: I0311 09:15:21.180214 4830 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180226 4830 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180237 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180247 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180258 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180269 4830 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180279 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180290 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180300 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180312 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180324 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180335 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180346 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180357 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180368 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180380 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180390 4830 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180402 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180413 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180424 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180436 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180447 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180458 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180470 4830 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180481 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180493 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180504 4830 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180515 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180527 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180538 4830 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180550 4830 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180562 4830 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180574 4830 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180585 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180597 4830 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180608 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180619 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180630 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180645 4830 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180657 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180668 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180680 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180691 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180702 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180830 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.180869 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.222169 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.228967 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.235027 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.237376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.237401 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.237409 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.237422 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.237431 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.247183 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 09:15:21 crc kubenswrapper[4830]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 11 09:15:21 crc kubenswrapper[4830]: set -o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 11 09:15:21 crc kubenswrapper[4830]: source /etc/kubernetes/apiserver-url.env Mar 11 09:15:21 crc kubenswrapper[4830]: else Mar 11 09:15:21 crc kubenswrapper[4830]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 11 09:15:21 crc kubenswrapper[4830]: exit 1 Mar 11 09:15:21 crc kubenswrapper[4830]: fi Mar 11 09:15:21 crc kubenswrapper[4830]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 11 09:15:21 crc kubenswrapper[4830]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 09:15:21 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.247480 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.248715 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.248752 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 11 09:15:21 crc kubenswrapper[4830]: W0311 09:15:21.250447 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-266f1e9d69b908fc71bff0247def4f0007b9892c230adf05a1c318dc88bef569 WatchSource:0}: Error finding container 266f1e9d69b908fc71bff0247def4f0007b9892c230adf05a1c318dc88bef569: Status 404 returned error can't find the container with id 266f1e9d69b908fc71bff0247def4f0007b9892c230adf05a1c318dc88bef569 Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.257160 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 09:15:21 crc kubenswrapper[4830]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 09:15:21 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 11 09:15:21 crc kubenswrapper[4830]: set -o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: source "/env/_master" Mar 11 09:15:21 crc 
kubenswrapper[4830]: set +o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: fi Mar 11 09:15:21 crc kubenswrapper[4830]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 11 09:15:21 crc kubenswrapper[4830]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 11 09:15:21 crc kubenswrapper[4830]: ho_enable="--enable-hybrid-overlay" Mar 11 09:15:21 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 11 09:15:21 crc kubenswrapper[4830]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 11 09:15:21 crc kubenswrapper[4830]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 11 09:15:21 crc kubenswrapper[4830]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 09:15:21 crc kubenswrapper[4830]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 11 09:15:21 crc kubenswrapper[4830]: --webhook-host=127.0.0.1 \ Mar 11 09:15:21 crc kubenswrapper[4830]: --webhook-port=9743 \ Mar 11 09:15:21 crc kubenswrapper[4830]: ${ho_enable} \ Mar 11 09:15:21 crc kubenswrapper[4830]: --enable-interconnect \ Mar 11 09:15:21 crc kubenswrapper[4830]: --disable-approver \ Mar 11 09:15:21 crc kubenswrapper[4830]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 11 09:15:21 crc kubenswrapper[4830]: --wait-for-kubernetes-api=200s \ Mar 11 09:15:21 crc kubenswrapper[4830]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 11 09:15:21 crc kubenswrapper[4830]: --loglevel="${LOGLEVEL}" Mar 11 09:15:21 crc kubenswrapper[4830]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 09:15:21 crc 
kubenswrapper[4830]: > logger="UnhandledError" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.260562 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 09:15:21 crc kubenswrapper[4830]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 09:15:21 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 11 09:15:21 crc kubenswrapper[4830]: set -o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: source "/env/_master" Mar 11 09:15:21 crc kubenswrapper[4830]: set +o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: fi Mar 11 09:15:21 crc kubenswrapper[4830]: Mar 11 09:15:21 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 11 09:15:21 crc kubenswrapper[4830]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 09:15:21 crc kubenswrapper[4830]: --disable-webhook \ Mar 11 09:15:21 crc kubenswrapper[4830]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 11 09:15:21 crc kubenswrapper[4830]: --loglevel="${LOGLEVEL}" Mar 11 09:15:21 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 09:15:21 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.261732 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.312914 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"266f1e9d69b908fc71bff0247def4f0007b9892c230adf05a1c318dc88bef569"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.314063 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2bc26c3f6970d35ee87403d1efdfda58db291f1b3acc4a402437aa3031c04b1e"} Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.314299 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 09:15:21 crc kubenswrapper[4830]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 09:15:21 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 11 09:15:21 crc kubenswrapper[4830]: set -o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: source "/env/_master" Mar 11 09:15:21 crc kubenswrapper[4830]: set +o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: fi Mar 11 09:15:21 crc kubenswrapper[4830]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 11 09:15:21 crc kubenswrapper[4830]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 11 09:15:21 crc kubenswrapper[4830]: ho_enable="--enable-hybrid-overlay" Mar 11 09:15:21 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 11 09:15:21 crc kubenswrapper[4830]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 11 09:15:21 crc kubenswrapper[4830]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 11 09:15:21 crc kubenswrapper[4830]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 09:15:21 crc kubenswrapper[4830]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 11 09:15:21 crc kubenswrapper[4830]: --webhook-host=127.0.0.1 \ Mar 11 09:15:21 crc kubenswrapper[4830]: --webhook-port=9743 \ Mar 11 09:15:21 crc kubenswrapper[4830]: ${ho_enable} \ Mar 11 09:15:21 crc kubenswrapper[4830]: --enable-interconnect \ Mar 11 09:15:21 crc kubenswrapper[4830]: --disable-approver \ Mar 11 09:15:21 crc kubenswrapper[4830]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 11 09:15:21 crc kubenswrapper[4830]: --wait-for-kubernetes-api=200s \ Mar 11 09:15:21 crc kubenswrapper[4830]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 11 09:15:21 crc kubenswrapper[4830]: --loglevel="${LOGLEVEL}" Mar 11 09:15:21 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 09:15:21 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.315000 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"694b03e3ba368ed549e5470525bcd6c746fc47cb6b85595602d474b9d3b75a18"} Mar 11 09:15:21 crc 
kubenswrapper[4830]: E0311 09:15:21.315411 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.317476 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 09:15:21 crc kubenswrapper[4830]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 11 09:15:21 crc kubenswrapper[4830]: set -o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 11 09:15:21 crc kubenswrapper[4830]: source /etc/kubernetes/apiserver-url.env Mar 11 09:15:21 crc kubenswrapper[4830]: else Mar 11 09:15:21 crc kubenswrapper[4830]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 11 09:15:21 crc kubenswrapper[4830]: exit 1 Mar 11 09:15:21 crc kubenswrapper[4830]: fi Mar 11 09:15:21 crc kubenswrapper[4830]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 11 09:15:21 crc kubenswrapper[4830]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 09:15:21 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.317654 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 09:15:21 crc kubenswrapper[4830]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 11 09:15:21 crc kubenswrapper[4830]: if [[ -f "/env/_master" ]]; then Mar 11 09:15:21 crc kubenswrapper[4830]: set -o allexport Mar 11 09:15:21 crc kubenswrapper[4830]: source "/env/_master" Mar 11 09:15:21 crc kubenswrapper[4830]: set +o allexport Mar 11 09:15:21 crc 
kubenswrapper[4830]: fi Mar 11 09:15:21 crc kubenswrapper[4830]: Mar 11 09:15:21 crc kubenswrapper[4830]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 11 09:15:21 crc kubenswrapper[4830]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 11 09:15:21 crc kubenswrapper[4830]: --disable-webhook \ Mar 11 09:15:21 crc kubenswrapper[4830]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 11 09:15:21 crc kubenswrapper[4830]: --loglevel="${LOGLEVEL}" Mar 11 09:15:21 crc kubenswrapper[4830]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Conta
inerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 11 09:15:21 crc kubenswrapper[4830]: > logger="UnhandledError" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.317765 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.319301 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.319948 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.325448 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.336427 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.340368 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.340441 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.340455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.340473 4830 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.340484 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.346235 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.355831 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.363692 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.371312 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.379419 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.389497 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.403149 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.413593 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.422454 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.434056 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.443928 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.443968 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.443981 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.443996 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.444006 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.545888 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.545914 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.545922 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.545934 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.545944 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.647964 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.647997 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.648004 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.648032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.648040 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.684783 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.684841 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.684863 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.684881 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.684903 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685007 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685048 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685057 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685097 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:22.685084684 +0000 UTC m=+90.466235373 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685397 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:15:22.685386362 +0000 UTC m=+90.466537061 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685465 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685487 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685498 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685561 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:22.685552156 +0000 UTC m=+90.466702845 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685601 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685627 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:22.685619948 +0000 UTC m=+90.466770637 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685674 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.685704 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:22.68569462 +0000 UTC m=+90.466845319 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.750650 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.750679 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.750688 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.750702 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 
09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.750712 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.853317 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.853345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.853355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.853367 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.853377 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.944504 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.945155 4830 scope.go:117] "RemoveContainer" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" Mar 11 09:15:21 crc kubenswrapper[4830]: E0311 09:15:21.945424 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.955679 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.955715 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.955728 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.955746 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:21 crc kubenswrapper[4830]: I0311 09:15:21.955759 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:21Z","lastTransitionTime":"2026-03-11T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.059328 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.059374 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.059397 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.059416 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.059428 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.162624 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.162696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.162717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.162742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.162763 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.242648 4830 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.266186 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.266255 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.266317 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.266347 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.266371 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.317919 4830 scope.go:117] "RemoveContainer" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.318098 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.369636 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.369673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.369686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.369705 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.369718 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.472906 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.472946 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.472954 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.472969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.472979 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.575492 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.575524 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.575536 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.575552 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.575562 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.678221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.678266 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.678282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.678306 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.678320 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.692932 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.693048 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693099 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:15:24.693068202 +0000 UTC m=+92.474218921 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693162 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.693164 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.693219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.693260 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693176 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693301 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693327 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693356 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:24.693338649 +0000 UTC m=+92.474489338 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693378 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:24.69336649 +0000 UTC m=+92.474517189 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693377 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693408 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693460 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered] Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693201 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693502 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:24.693488963 +0000 UTC m=+92.474639762 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.693527 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:24.693516804 +0000 UTC m=+92.474667623 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.780792 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.780850 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.780860 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.780876 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.780886 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.884130 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.884176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.884193 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.884217 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.884235 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.933166 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.933222 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.933196 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.933417 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.933501 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:22 crc kubenswrapper[4830]: E0311 09:15:22.933610 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.938893 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.939642 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.941173 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.941911 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.943088 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.943659 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.944413 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.945274 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9c
c6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.945403 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.946008 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.946866 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.947400 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.948519 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.949113 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.949713 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.950849 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.951521 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.952849 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.953497 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.954254 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.955469 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.956156 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.957127 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.957628 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.958010 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.958701 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.959188 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.959769 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.960794 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 11 09:15:22 crc 
kubenswrapper[4830]: I0311 09:15:22.961279 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.962248 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.962878 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.963939 4830 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.964055 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.965704 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.966562 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.966966 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.968425 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.969107 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.970055 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.970666 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.971698 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.972140 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.972894 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.973041 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.973600 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.974553 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.974979 4830 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.975970 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.976596 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.977742 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.978293 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.979237 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.979754 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.980816 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.981408 4830 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.981846 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.982761 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.987441 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.987477 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.987487 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.987502 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.987512 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:22Z","lastTransitionTime":"2026-03-11T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.988641 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:22 crc kubenswrapper[4830]: I0311 09:15:22.999203 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.012308 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.022768 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.089274 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.089327 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 
09:15:23.089337 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.089352 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.089361 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.192299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.192373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.192389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.192410 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.192425 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.296032 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.296071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.296081 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.296094 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.296107 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.341149 4830 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.398631 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.398680 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.398692 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.398712 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.398725 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.501822 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.501887 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.501909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.501939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.501961 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.605465 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.605505 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.605514 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.605528 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.605537 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.708477 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.708551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.708575 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.708605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.708626 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.811486 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.811537 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.811550 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.811567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.811579 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.913899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.913939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.913947 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.913962 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:23 crc kubenswrapper[4830]: I0311 09:15:23.913971 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:23Z","lastTransitionTime":"2026-03-11T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.016780 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.017133 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.017145 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.017164 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.017176 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.118849 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.118884 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.118895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.118910 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.118921 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.221672 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.221751 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.221774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.221805 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.221826 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.323123 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.323158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.323169 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.323186 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.323196 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.425755 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.425800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.425815 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.425835 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.425850 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.528599 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.528655 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.528674 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.528697 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.528714 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.631167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.631202 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.631210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.631225 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.631233 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.710041 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.710128 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.710173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.710203 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.710231 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710303 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:15:28.710267551 +0000 UTC m=+96.491418280 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710311 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710390 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710387 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710412 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710406 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710457 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710481 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710394 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:28.710379475 +0000 UTC m=+96.491530204 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710431 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710569 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:28.71054826 +0000 UTC m=+96.491698989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710595 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:28.710581791 +0000 UTC m=+96.491732520 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.710618 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:28.710605971 +0000 UTC m=+96.491756700 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.733117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.733158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.733169 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.733328 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.733341 4830 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.835615 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.835708 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.835736 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.835781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.835794 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.932167 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.932193 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.932213 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.932321 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.932408 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:24 crc kubenswrapper[4830]: E0311 09:15:24.932503 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.938170 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.938210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.938221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.938238 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:24 crc kubenswrapper[4830]: I0311 09:15:24.938253 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:24Z","lastTransitionTime":"2026-03-11T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.040208 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.040260 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.040276 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.040296 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.040312 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.142664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.142696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.142704 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.142719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.142727 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.245618 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.245692 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.245706 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.245731 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.245745 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.347922 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.347961 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.347970 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.347987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.347997 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.450518 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.450577 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.450593 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.450617 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.450632 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.553395 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.553438 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.553449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.553466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.553479 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.656480 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.656552 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.656568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.656598 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.656614 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.759307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.759365 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.759383 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.759407 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.759424 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.862538 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.862592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.862609 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.862638 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.862654 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.965210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.965254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.965264 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.965281 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:25 crc kubenswrapper[4830]: I0311 09:15:25.965291 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:25Z","lastTransitionTime":"2026-03-11T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.067937 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.067979 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.067991 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.068009 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.068042 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.169813 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.169846 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.169856 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.169870 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.169881 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.272071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.272131 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.272142 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.272162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.272174 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.374581 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.374624 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.374632 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.374646 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.374655 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.477722 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.477798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.477811 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.477830 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.477872 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.580082 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.580131 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.580144 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.580162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.580174 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.682178 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.682222 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.682231 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.682245 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.682254 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.784323 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.784367 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.784380 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.784397 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.784410 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.886573 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.886612 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.886621 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.886635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.886644 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.931965 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.932028 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.932079 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:26 crc kubenswrapper[4830]: E0311 09:15:26.932141 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:26 crc kubenswrapper[4830]: E0311 09:15:26.932239 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:26 crc kubenswrapper[4830]: E0311 09:15:26.932444 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.989266 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.989355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.989372 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.989390 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:26 crc kubenswrapper[4830]: I0311 09:15:26.989407 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:26Z","lastTransitionTime":"2026-03-11T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.091659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.091744 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.091767 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.091798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.091822 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.194157 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.194226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.194240 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.194255 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.194266 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.296448 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.296488 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.296500 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.296517 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.296528 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.398696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.398748 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.398760 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.398778 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.398790 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.502606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.502667 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.502686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.502711 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.502729 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.605091 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.605159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.605202 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.605226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.605243 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.707595 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.707646 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.707663 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.707710 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.707730 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.809980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.810084 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.810097 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.810123 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.810136 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.912439 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.912484 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.912496 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.912514 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:27 crc kubenswrapper[4830]: I0311 09:15:27.912526 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:27Z","lastTransitionTime":"2026-03-11T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.015161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.015218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.015232 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.015254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.015270 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.118340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.118401 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.118420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.118443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.118460 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.221544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.221594 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.221608 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.221626 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.221639 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.324742 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.324805 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.324815 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.324830 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.324841 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.427909 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.427979 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.427999 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.428063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.428084 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.531279 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.531359 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.531373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.531392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.531404 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.634510 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.634578 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.634596 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.634621 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.634639 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.738325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.738393 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.738403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.738436 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.738447 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.747131 4830 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.763906 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.764226 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764279 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:15:36.764226787 +0000 UTC m=+104.545377546 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.764361 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.764455 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.764522 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764528 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764652 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764675 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:36.764637748 +0000 UTC m=+104.545788477 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764536 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764744 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764773 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764747 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:36.764722981 +0000 UTC m=+104.545873900 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764827 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:36.764813123 +0000 UTC m=+104.545963842 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764869 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764908 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.764933 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.765110 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:36.76508757 +0000 UTC m=+104.546238509 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.841893 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.841955 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.841971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.841991 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.842004 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.932353 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.932387 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.932493 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.932650 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.932774 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:28 crc kubenswrapper[4830]: E0311 09:15:28.932916 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.944714 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.944758 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.944768 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.944788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:28 crc kubenswrapper[4830]: I0311 09:15:28.944799 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:28Z","lastTransitionTime":"2026-03-11T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.047308 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.047379 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.047402 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.047437 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.047461 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.150165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.150214 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.150226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.150243 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.150255 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.252738 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.252800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.252819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.252844 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.252861 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.356493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.356538 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.356553 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.356571 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.356585 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.459514 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.459561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.459574 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.459592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.459603 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.562733 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.562797 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.562841 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.562858 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.562870 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.665704 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.665734 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.665741 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.665754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.665762 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.767590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.767690 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.767708 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.767731 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.767747 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.870484 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.870534 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.870542 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.870558 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.870570 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.973274 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.973319 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.973335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.973353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:29 crc kubenswrapper[4830]: I0311 09:15:29.973365 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:29Z","lastTransitionTime":"2026-03-11T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.076506 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.076569 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.076582 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.076619 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.076637 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.179151 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.179185 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.179195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.179210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.179221 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.281722 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.281791 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.281814 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.281846 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.281867 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.384479 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.384512 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.384522 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.384537 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.384548 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.486840 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.486872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.486881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.486895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.486924 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.573518 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.573574 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.573591 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.573616 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.573635 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.589768 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.594369 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.594602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.594781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.594932 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.595102 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.611296 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.615869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.616042 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.616149 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.616255 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.616345 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.632101 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.637934 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.637984 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.638000 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.638039 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.638056 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.652470 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.655888 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.655929 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.655939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.655955 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.655967 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.665180 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.665284 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.666725 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.666833 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.666902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.666990 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.667101 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.769479 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.769517 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.769529 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.769548 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.769560 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.871674 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.871738 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.871754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.871806 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.871822 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.932160 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.932271 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.932783 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.932921 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.933231 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:30 crc kubenswrapper[4830]: E0311 09:15:30.933371 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.974627 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.974666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.974678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.974695 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:30 crc kubenswrapper[4830]: I0311 09:15:30.974707 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:30Z","lastTransitionTime":"2026-03-11T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.077147 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.077203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.077217 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.077237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.077252 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.183907 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.183958 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.183969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.183983 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.183995 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.286161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.286202 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.286215 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.286233 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.286246 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.388405 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.388461 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.388481 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.388507 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.388525 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.491598 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.491642 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.491651 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.491666 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.491676 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.593693 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.593773 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.593796 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.593826 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.593881 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.696762 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.696815 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.696832 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.696856 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.696873 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.799514 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.799618 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.799637 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.799660 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.799677 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.902254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.902302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.902320 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.902341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:31 crc kubenswrapper[4830]: I0311 09:15:31.902357 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:31Z","lastTransitionTime":"2026-03-11T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.005004 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.005090 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.005108 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.005132 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.005148 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.108303 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.108341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.108353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.108369 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.108381 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.211083 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.211150 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.211167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.211197 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.211216 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.314281 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.314345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.314363 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.314389 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.314409 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.342507 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.342550 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.344939 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.365249 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.380452 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.391512 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.407711 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.416113 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.416160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.416172 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.416190 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.416205 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.423288 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.439406 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.451121 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.463927 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.475112 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.491396 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.505753 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.518582 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.518887 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 
09:15:32.518991 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.519123 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.519231 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.520169 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.537454 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.549762 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.577528 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.592399 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.621548 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.621674 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.621738 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.621829 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.621911 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.724727 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.724781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.724798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.724822 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.724840 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.827694 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.827728 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.827739 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.827755 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.827767 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.931475 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.931592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.931613 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.931497 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.931640 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.931658 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:32Z","lastTransitionTime":"2026-03-11T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.931608 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:32 crc kubenswrapper[4830]: E0311 09:15:32.932668 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:32 crc kubenswrapper[4830]: E0311 09:15:32.933108 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.933201 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:32 crc kubenswrapper[4830]: E0311 09:15:32.933365 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.933457 4830 scope.go:117] "RemoveContainer" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" Mar 11 09:15:32 crc kubenswrapper[4830]: E0311 09:15:32.933694 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 11 09:15:32 crc kubenswrapper[4830]: I0311 09:15:32.986242 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.007117 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.022682 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.033731 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.033762 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.033775 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.033795 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.033812 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.049637 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.064165 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.081680 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.104634 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.124718 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.136774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.136895 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc 
kubenswrapper[4830]: I0311 09:15:33.136974 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.137099 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.137212 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.239700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.239763 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.239780 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.239804 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.239822 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.343768 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.343839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.343865 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.343897 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.343918 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.451174 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.451248 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.451271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.451300 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.451323 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.554539 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.554588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.554606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.554629 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.554648 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.658348 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.658403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.658423 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.658446 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.658465 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.760948 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.761007 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.761054 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.761080 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.761099 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.864251 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.864531 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.864664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.864814 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.864938 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.967734 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.967988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.968079 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.968149 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:33 crc kubenswrapper[4830]: I0311 09:15:33.968210 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:33Z","lastTransitionTime":"2026-03-11T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.070980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.071056 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.071073 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.071096 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.071118 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.173630 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.173678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.173697 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.173719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.173736 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.277228 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.277281 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.277298 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.277323 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.277339 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.380980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.381104 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.381134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.381166 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.381189 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.483867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.483958 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.483995 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.484086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.484129 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.587008 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.587111 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.587134 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.587163 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.587185 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.691736 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.691798 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.691816 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.691839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.691857 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.793732 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.793820 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.793846 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.793877 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.793902 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.896700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.896735 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.896745 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.896759 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.896768 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.932395 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:34 crc kubenswrapper[4830]: E0311 09:15:34.932519 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.932585 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.932670 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:34 crc kubenswrapper[4830]: E0311 09:15:34.932807 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:34 crc kubenswrapper[4830]: E0311 09:15:34.932872 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.998840 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.998887 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.998910 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.998929 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:34 crc kubenswrapper[4830]: I0311 09:15:34.998940 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:34Z","lastTransitionTime":"2026-03-11T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.101971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.102080 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.102103 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.102158 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.102176 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.204669 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.204735 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.204752 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.204782 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.204804 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.307922 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.308065 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.308159 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.308189 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.308247 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.355319 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.386494 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:5
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:35Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.408007 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a
85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:35Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.411311 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.411353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.411367 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.411387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.411400 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.428003 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:35Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.448749 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:35Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.466706 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:35Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.489744 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:35Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.509644 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:35Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.513692 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.513735 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.513749 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.513770 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.513782 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.528278 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:35Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.615814 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.615867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.615881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.615899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.615910 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.718803 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.718842 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.718852 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.718867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.718876 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.821222 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.821271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.821284 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.821301 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.821315 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.924063 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.924128 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.924144 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.924172 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:35 crc kubenswrapper[4830]: I0311 09:15:35.924190 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:35Z","lastTransitionTime":"2026-03-11T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.026466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.026524 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.026540 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.026564 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.026581 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.129752 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.129811 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.129827 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.129855 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.129871 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.232181 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.232231 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.232244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.232265 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.232278 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.334618 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.334662 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.334675 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.334693 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.334707 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.437132 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.437212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.437237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.437265 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.437286 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.540060 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.540106 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.540119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.540137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.540150 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.642325 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.642392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.642417 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.642442 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.642461 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.745085 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.745137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.745149 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.745167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.745178 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.840113 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.840206 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840221 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:15:52.840201035 +0000 UTC m=+120.621351734 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.840253 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.840283 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.840306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840326 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 
09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840342 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840356 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840387 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840395 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:52.84038449 +0000 UTC m=+120.621535189 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840420 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-11 09:15:52.840410991 +0000 UTC m=+120.621561700 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840506 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840532 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840556 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840590 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840644 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:52.840616676 +0000 UTC m=+120.621767415 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.840687 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:52.840660728 +0000 UTC m=+120.621811457 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.848470 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.848546 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.848563 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.848594 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.848612 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.934269 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.934497 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.935115 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.935178 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.935368 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:36 crc kubenswrapper[4830]: E0311 09:15:36.935438 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.951273 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.951336 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.951346 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.951362 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:36 crc kubenswrapper[4830]: I0311 09:15:36.951374 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:36Z","lastTransitionTime":"2026-03-11T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.054488 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.054717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.054743 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.054776 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.054801 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.156929 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.156975 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.156987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.157003 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.157039 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.259580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.259641 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.259659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.259683 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.259703 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.366330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.366400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.366417 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.366442 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.366461 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.471439 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.471491 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.471508 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.471529 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.471546 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.574228 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.574285 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.574307 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.574335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.574504 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.723426 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.723460 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.723468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.723483 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.723491 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.826864 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.826936 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.826949 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.826966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.826977 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.928881 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.929010 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.929067 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.929094 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:37 crc kubenswrapper[4830]: I0311 09:15:37.929111 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:37Z","lastTransitionTime":"2026-03-11T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.031391 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.031474 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.031493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.031517 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.031534 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.134238 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.134314 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.134338 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.134370 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.134395 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.236788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.236866 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.236891 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.236918 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.236936 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.340652 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.340719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.340739 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.340764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.340810 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.443122 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.443226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.443254 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.443287 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.443311 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.545517 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.545555 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.545568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.545587 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.545600 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.648375 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.648436 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.648456 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.648482 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.648500 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.751274 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.751331 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.751344 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.751363 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.751378 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.854114 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.854171 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.854187 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.854213 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.854231 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.932348 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.932451 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:38 crc kubenswrapper[4830]: E0311 09:15:38.932532 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.932572 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:38 crc kubenswrapper[4830]: E0311 09:15:38.932764 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:38 crc kubenswrapper[4830]: E0311 09:15:38.932910 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.957124 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.957196 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.957226 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.957256 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:38 crc kubenswrapper[4830]: I0311 09:15:38.957278 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:38Z","lastTransitionTime":"2026-03-11T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.059896 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.059969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.059994 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.060053 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.060076 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.163525 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.163617 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.163635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.163690 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.163709 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.266524 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.266584 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.266606 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.266631 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.266651 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.369641 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.369696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.369713 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.369739 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.369756 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.472784 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.472851 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.472868 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.472893 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.472911 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.575871 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.575925 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.575943 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.575966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.575983 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.679859 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.679916 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.679939 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.679971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.679994 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.782563 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.782633 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.782656 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.782687 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.782713 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.886219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.886282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.886306 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.886335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.886355 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.989257 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.989318 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.989335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.989359 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:39 crc kubenswrapper[4830]: I0311 09:15:39.989376 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:39Z","lastTransitionTime":"2026-03-11T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.092246 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.092300 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.092317 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.092343 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.092360 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.195723 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.195786 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.195802 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.195827 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.195845 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.298756 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.298807 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.298825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.298854 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.298871 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.401517 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.401597 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.401621 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.401653 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.401716 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.505219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.505665 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.505865 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.505998 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.506239 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.609276 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.609357 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.609382 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.609412 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.609435 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.712315 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.712363 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.712376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.712394 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.712406 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.815803 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.815869 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.815889 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.815917 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.815935 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.919788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.919872 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.919908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.920049 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.920076 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:40Z","lastTransitionTime":"2026-03-11T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.931459 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.931468 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:40 crc kubenswrapper[4830]: I0311 09:15:40.931686 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:40 crc kubenswrapper[4830]: E0311 09:15:40.931831 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:40 crc kubenswrapper[4830]: E0311 09:15:40.932056 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:40 crc kubenswrapper[4830]: E0311 09:15:40.932355 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.023738 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.024059 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.024234 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.024391 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.024525 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.062291 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.062439 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.062502 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.062573 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.062635 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: E0311 09:15:41.079194 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.083989 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.084111 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.084176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.084241 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.084306 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: E0311 09:15:41.101861 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.106557 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.106736 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.106823 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.106912 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.107005 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: E0311 09:15:41.120621 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.124908 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.124969 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.124988 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.125013 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.125068 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: E0311 09:15:41.143652 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.148894 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.149057 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.149289 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.149517 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.149614 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: E0311 09:15:41.166804 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:41 crc kubenswrapper[4830]: E0311 09:15:41.167231 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.168967 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.169071 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.169096 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.169122 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.169141 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.272494 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.272551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.272568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.272592 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.272612 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.375408 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.375771 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.376068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.376270 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.376464 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.479757 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.479806 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.479815 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.479829 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.479838 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.581771 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.581833 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.581851 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.581889 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.581908 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.685253 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.685310 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.685327 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.685350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.685368 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.787679 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.787746 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.787764 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.787789 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.787812 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.890477 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.890534 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.890551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.890574 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.890590 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.993300 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.993355 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.993372 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.993398 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:41 crc kubenswrapper[4830]: I0311 09:15:41.993415 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:41Z","lastTransitionTime":"2026-03-11T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.096447 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.096519 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.096544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.096575 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.096601 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.199979 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.200065 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.200086 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.200110 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.200127 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.303522 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.303590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.303612 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.303643 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.303665 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.359464 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-c4pts"] Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.359871 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-c4pts" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.363254 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.363426 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.363619 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.380078 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.397558 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.406679 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.406731 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.406749 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc 
kubenswrapper[4830]: I0311 09:15:42.406774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.406791 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.411333 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.435390 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.455227 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.472664 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.486390 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.495821 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5e816263-8213-4e07-b0e1-a5963aa3381c-hosts-file\") pod \"node-resolver-c4pts\" (UID: \"5e816263-8213-4e07-b0e1-a5963aa3381c\") " pod="openshift-dns/node-resolver-c4pts" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.495913 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgfsq\" (UniqueName: \"kubernetes.io/projected/5e816263-8213-4e07-b0e1-a5963aa3381c-kube-api-access-mgfsq\") pod \"node-resolver-c4pts\" (UID: \"5e816263-8213-4e07-b0e1-a5963aa3381c\") " pod="openshift-dns/node-resolver-c4pts" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.505499 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.509551 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.509746 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.509879 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.509998 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.510139 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.522846 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.596746 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5e816263-8213-4e07-b0e1-a5963aa3381c-hosts-file\") pod \"node-resolver-c4pts\" (UID: \"5e816263-8213-4e07-b0e1-a5963aa3381c\") " pod="openshift-dns/node-resolver-c4pts" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.596980 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/5e816263-8213-4e07-b0e1-a5963aa3381c-hosts-file\") pod \"node-resolver-c4pts\" (UID: \"5e816263-8213-4e07-b0e1-a5963aa3381c\") " pod="openshift-dns/node-resolver-c4pts" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.597142 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgfsq\" (UniqueName: \"kubernetes.io/projected/5e816263-8213-4e07-b0e1-a5963aa3381c-kube-api-access-mgfsq\") pod \"node-resolver-c4pts\" (UID: \"5e816263-8213-4e07-b0e1-a5963aa3381c\") " pod="openshift-dns/node-resolver-c4pts" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.612987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.613152 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.613400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.613616 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.613811 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.632908 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgfsq\" (UniqueName: \"kubernetes.io/projected/5e816263-8213-4e07-b0e1-a5963aa3381c-kube-api-access-mgfsq\") pod \"node-resolver-c4pts\" (UID: \"5e816263-8213-4e07-b0e1-a5963aa3381c\") " pod="openshift-dns/node-resolver-c4pts" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.682686 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c4pts" Mar 11 09:15:42 crc kubenswrapper[4830]: W0311 09:15:42.704752 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e816263_8213_4e07_b0e1_a5963aa3381c.slice/crio-422c9e647f36b6f7b9ece2393bd9a88c44aad007c7dc01bbe0cf342c3e9eb819 WatchSource:0}: Error finding container 422c9e647f36b6f7b9ece2393bd9a88c44aad007c7dc01bbe0cf342c3e9eb819: Status 404 returned error can't find the container with id 422c9e647f36b6f7b9ece2393bd9a88c44aad007c7dc01bbe0cf342c3e9eb819 Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.718468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.718527 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.718544 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.718569 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.718586 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.730919 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8w98l"] Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.731207 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-vgww4"] Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.731685 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-p7jq8"] Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.731944 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.732440 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.732744 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.733996 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.734522 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.735694 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736034 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736054 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736080 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736159 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736184 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736210 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736305 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736361 4830 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.736498 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.748473 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.771487 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.788702 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800068 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-hostroot\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800111 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9rlx\" (UniqueName: \"kubernetes.io/projected/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-kube-api-access-p9rlx\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800137 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800181 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-cni-binary-copy\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800203 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-daemon-config\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800229 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-etc-kubernetes\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800346 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cni-binary-copy\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800443 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-os-release\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800494 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-k8s-cni-cncf-io\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800652 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-cni-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800701 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2bdde2fd-3db4-4b41-9287-58960dcab5d9-rootfs\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800750 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bdde2fd-3db4-4b41-9287-58960dcab5d9-proxy-tls\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800799 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbzc8\" (UniqueName: \"kubernetes.io/projected/2bdde2fd-3db4-4b41-9287-58960dcab5d9-kube-api-access-rbzc8\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800848 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bdde2fd-3db4-4b41-9287-58960dcab5d9-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800930 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-cni-multus\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.800981 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-kubelet\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 
09:15:42.801061 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-system-cni-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.801113 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-socket-dir-parent\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.801158 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-conf-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.801208 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-system-cni-dir\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.801251 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-os-release\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc 
kubenswrapper[4830]: I0311 09:15:42.801296 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-netns\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.801338 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cnibin\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.801382 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpnz7\" (UniqueName: \"kubernetes.io/projected/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-kube-api-access-wpnz7\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.801426 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-cnibin\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.801456 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-cni-bin\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 
crc kubenswrapper[4830]: I0311 09:15:42.801481 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-multus-certs\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.808563 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/ho
st\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.821624 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.821664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.821682 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.821702 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.821716 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.835274 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.849961 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.862659 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.875200 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902544 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-cni-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902593 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2bdde2fd-3db4-4b41-9287-58960dcab5d9-rootfs\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902614 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/2bdde2fd-3db4-4b41-9287-58960dcab5d9-proxy-tls\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902638 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbzc8\" (UniqueName: \"kubernetes.io/projected/2bdde2fd-3db4-4b41-9287-58960dcab5d9-kube-api-access-rbzc8\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bdde2fd-3db4-4b41-9287-58960dcab5d9-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902684 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-cni-multus\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902704 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-kubelet\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902732 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-system-cni-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902735 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-cni-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902752 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-socket-dir-parent\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902815 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-conf-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902819 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-socket-dir-parent\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902833 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-os-release\") pod \"multus-additional-cni-plugins-vgww4\" 
(UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2bdde2fd-3db4-4b41-9287-58960dcab5d9-rootfs\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902883 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-system-cni-dir\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902864 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-system-cni-dir\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902910 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-netns\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902926 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cnibin\") pod \"multus-additional-cni-plugins-vgww4\" (UID: 
\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902944 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpnz7\" (UniqueName: \"kubernetes.io/projected/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-kube-api-access-wpnz7\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902966 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-cnibin\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902982 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-cni-bin\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.902997 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-multus-certs\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903028 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " 
pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903046 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-hostroot\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903065 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9rlx\" (UniqueName: \"kubernetes.io/projected/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-kube-api-access-p9rlx\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-cni-binary-copy\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903105 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-daemon-config\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903149 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-etc-kubernetes\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903166 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cni-binary-copy\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903180 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903196 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-os-release\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903210 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-k8s-cni-cncf-io\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903257 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-k8s-cni-cncf-io\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903279 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-conf-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903316 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-os-release\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903335 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-netns\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903353 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cnibin\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903643 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-cnibin\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903667 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-cni-bin\") pod \"multus-8w98l\" (UID: 
\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.903685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-run-multus-certs\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.904080 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-hostroot\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.904540 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.905202 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-cni-binary-copy\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.905291 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-etc-kubernetes\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.905434 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-multus-daemon-config\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.905502 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-os-release\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.905514 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-kubelet\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.905567 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-host-var-lib-cni-multus\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.905621 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-system-cni-dir\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.906139 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/2bdde2fd-3db4-4b41-9287-58960dcab5d9-mcd-auth-proxy-config\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.906189 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.906345 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-cni-binary-copy\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.906655 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bdde2fd-3db4-4b41-9287-58960dcab5d9-proxy-tls\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.916409 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.926806 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.926846 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.926856 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.926868 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.926878 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:42Z","lastTransitionTime":"2026-03-11T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.935204 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:42 crc kubenswrapper[4830]: E0311 09:15:42.935351 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.935427 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:42 crc kubenswrapper[4830]: E0311 09:15:42.935485 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.936008 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:42 crc kubenswrapper[4830]: E0311 09:15:42.936129 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.937281 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbzc8\" (UniqueName: \"kubernetes.io/projected/2bdde2fd-3db4-4b41-9287-58960dcab5d9-kube-api-access-rbzc8\") pod \"machine-config-daemon-p7jq8\" (UID: \"2bdde2fd-3db4-4b41-9287-58960dcab5d9\") " pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.941896 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpnz7\" (UniqueName: \"kubernetes.io/projected/2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde-kube-api-access-wpnz7\") pod \"multus-additional-cni-plugins-vgww4\" (UID: \"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\") " pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.944410 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9rlx\" (UniqueName: \"kubernetes.io/projected/75fdb109-77cf-4d97-ac3c-6f3139b3bb7a-kube-api-access-p9rlx\") pod \"multus-8w98l\" (UID: \"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\") " pod="openshift-multus/multus-8w98l" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.945092 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.957966 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.967935 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.977429 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:42 crc kubenswrapper[4830]: I0311 09:15:42.995936 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\
":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0f
dc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.008435 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a
85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.019352 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.029648 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.029694 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.029703 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.029717 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.029727 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.033803 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.047915 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.059403 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.061566 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: W0311 09:15:43.070996 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bdde2fd_3db4_4b41_9287_58960dcab5d9.slice/crio-831d1a8a111d1fbdaa60b6af3fd07e3c39eea085aab82acb4668a1eb028a3551 WatchSource:0}: Error finding container 831d1a8a111d1fbdaa60b6af3fd07e3c39eea085aab82acb4668a1eb028a3551: Status 404 returned error can't find the container with id 831d1a8a111d1fbdaa60b6af3fd07e3c39eea085aab82acb4668a1eb028a3551 Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.074492 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8w98l" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.075989 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.084783 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vgww4" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.090388 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.101124 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: W0311 09:15:43.105429 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d4bb584_ca5b_4e05_b3be_a4a3c1a92fde.slice/crio-5aa0174c4c875c9674f0fb5e163dec3ea23a85c48fb9d6302813c9cb8d1574f8 WatchSource:0}: Error 
finding container 5aa0174c4c875c9674f0fb5e163dec3ea23a85c48fb9d6302813c9cb8d1574f8: Status 404 returned error can't find the container with id 5aa0174c4c875c9674f0fb5e163dec3ea23a85c48fb9d6302813c9cb8d1574f8 Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.113564 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gtl5j"] Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.114526 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.118478 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.118495 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.118570 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.118748 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.118859 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.118966 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.119082 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.119186 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.134455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.134490 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.134501 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.134517 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.134531 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.136221 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.160435 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.175980 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.191956 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.203678 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-ovn\") pod \"ovnkube-node-gtl5j\" (UID: 
\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205454 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-bin\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205476 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-netns\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205514 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-netd\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205548 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-slash\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205608 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-log-socket\") pod \"ovnkube-node-gtl5j\" (UID: 
\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205624 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqfbx\" (UniqueName: \"kubernetes.io/projected/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-kube-api-access-wqfbx\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205648 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-systemd-units\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205671 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-node-log\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205686 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-var-lib-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205727 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205763 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-ovn-kubernetes\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205788 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-config\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205812 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-script-lib\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205835 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-kubelet\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205855 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-etc-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205878 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205899 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-env-overrides\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205919 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-systemd\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.205952 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovn-node-metrics-cert\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.219367 4830 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.231974 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.240827 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.240858 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.240867 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.240880 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.240890 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.245116 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.264343 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c
5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.275706 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a
85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.290475 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.302895 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a
85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.306333 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-config\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307038 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-ovn-kubernetes\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-script-lib\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307329 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-kubelet\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307412 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-etc-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307485 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-systemd\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307610 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-env-overrides\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc 
kubenswrapper[4830]: I0311 09:15:43.307682 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovn-node-metrics-cert\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307755 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-ovn\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307810 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-script-lib\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307869 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-systemd\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307824 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-bin\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307940 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-netns\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307964 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-netd\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307984 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-slash\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308004 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-log-socket\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308041 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqfbx\" (UniqueName: \"kubernetes.io/projected/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-kube-api-access-wqfbx\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308069 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-systemd-units\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308092 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-node-log\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308117 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-var-lib-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308142 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308205 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.307083 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-ovn-kubernetes\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308247 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-kubelet\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.306992 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-config\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308286 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-etc-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308343 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-netns\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308373 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-netd\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308401 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-slash\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308428 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-log-socket\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308645 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-systemd-units\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-node-log\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308714 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-var-lib-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: 
\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.308806 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-bin\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.309129 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-env-overrides\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.309174 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-openvswitch\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.309197 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-ovn\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.311666 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovn-node-metrics-cert\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc 
kubenswrapper[4830]: I0311 09:15:43.321347 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.325699 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqfbx\" (UniqueName: \"kubernetes.io/projected/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-kube-api-access-wqfbx\") pod \"ovnkube-node-gtl5j\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.342221 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.343671 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.343800 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.343887 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.343978 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.344078 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.356013 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.370603 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.383122 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerStarted","Data":"28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.383185 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerStarted","Data":"5aa0174c4c875c9674f0fb5e163dec3ea23a85c48fb9d6302813c9cb8d1574f8"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.384491 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8w98l" event={"ID":"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a","Type":"ContainerStarted","Data":"4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.384514 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-8w98l" event={"ID":"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a","Type":"ContainerStarted","Data":"c17f747f1e13982bde13e0a38542107c5e1b6d2461ca6a7f949297cca0614de1"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.386393 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.386449 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.386463 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"831d1a8a111d1fbdaa60b6af3fd07e3c39eea085aab82acb4668a1eb028a3551"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.387529 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c4pts" event={"ID":"5e816263-8213-4e07-b0e1-a5963aa3381c","Type":"ContainerStarted","Data":"61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.387556 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c4pts" event={"ID":"5e816263-8213-4e07-b0e1-a5963aa3381c","Type":"ContainerStarted","Data":"422c9e647f36b6f7b9ece2393bd9a88c44aad007c7dc01bbe0cf342c3e9eb819"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.389781 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.401764 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.413617 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.424713 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.438221 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.448367 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.448403 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.448415 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.448432 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.448443 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.450617 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.451392 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: W0311 09:15:43.461381 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13b9ac6c_3f4b_4dd4_b91d_7173880939d8.slice/crio-e56274693a8f57aeb7ae06293937d218211f4df2f998847c632307b0678b6389 WatchSource:0}: Error finding container e56274693a8f57aeb7ae06293937d218211f4df2f998847c632307b0678b6389: Status 404 returned error can't find the container with id e56274693a8f57aeb7ae06293937d218211f4df2f998847c632307b0678b6389 Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.472533 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.484706 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.505505 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\
":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0f
dc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.520457 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a
85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.534905 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.548589 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.550350 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.550376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.550384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.550400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.550411 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.562137 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z 
is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.576926 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.597633 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-ce
rt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-open
vswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-
openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.607467 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.619710 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.636844 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.651139 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.653640 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.653686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.653700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.653720 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.653732 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.665945 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.680178 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:43Z is after 2025-08-24T17:21:41Z"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.756434 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.756482 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.756495 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:43 crc 
kubenswrapper[4830]: I0311 09:15:43.756513 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.756524 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.859304 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.859354 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.859368 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.859387 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.859399 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.963107 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.963140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.963149 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.963162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:43 crc kubenswrapper[4830]: I0311 09:15:43.963173 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:43Z","lastTransitionTime":"2026-03-11T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.066540 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.066587 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.066598 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.066617 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.066630 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.169376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.169448 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.169468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.169495 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.169758 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.272502 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.273140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.273149 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.273162 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.273170 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.375040 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.375084 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.375095 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.375115 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.375128 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.391642 4830 generic.go:334] "Generic (PLEG): container finished" podID="2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde" containerID="28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7" exitCode=0 Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.391699 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerDied","Data":"28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.397650 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0" exitCode=0 Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.397695 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.397727 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"e56274693a8f57aeb7ae06293937d218211f4df2f998847c632307b0678b6389"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.415490 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.435395 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.451745 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z"
Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.475322 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.478667 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.478719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.478739 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.478768 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.478788 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.502051 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.528844 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.551207 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.574215 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.589928 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.589968 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.589980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.589997 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.590011 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.603264 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.630759 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.655234 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.674449 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.688261 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.693695 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.693739 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.693752 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.693769 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.693783 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.710267 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.724393 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.743860 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.762074 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.777474 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.794432 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.796646 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.796673 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.796681 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.796696 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.796707 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.818965 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.832688 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.847790 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.859532 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.873286 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.887086 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.896924 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:44Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.898695 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.898725 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.898738 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:44 crc 
kubenswrapper[4830]: I0311 09:15:44.898754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.898767 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:44Z","lastTransitionTime":"2026-03-11T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.931582 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.931704 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:44 crc kubenswrapper[4830]: E0311 09:15:44.931769 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:44 crc kubenswrapper[4830]: E0311 09:15:44.931700 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:44 crc kubenswrapper[4830]: I0311 09:15:44.931582 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:44 crc kubenswrapper[4830]: E0311 09:15:44.932221 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.001312 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.001363 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.001375 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.001393 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.001407 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.103732 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.103770 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.103779 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.103795 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.103806 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.206979 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.207004 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.207255 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.207274 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.207283 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.310205 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.310267 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.310282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.310305 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.310321 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.403932 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.403990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.404008 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.404048 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.404064 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.404079 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} Mar 11 09:15:45 crc kubenswrapper[4830]: 
I0311 09:15:45.406696 4830 generic.go:334] "Generic (PLEG): container finished" podID="2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde" containerID="a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d" exitCode=0 Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.406755 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerDied","Data":"a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.413563 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.413588 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.413595 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.413607 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.413616 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.425644 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.437106 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.450490 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.477230 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.495905 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.515837 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.516515 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.516542 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.516549 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.516565 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.516574 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.530818 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.547568 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.567400 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.582231 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.604782 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.619068 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.619113 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.619122 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.619140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.619151 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.624007 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.635731 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:45Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.724706 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.724767 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.724785 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.724808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.724825 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.828250 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.828309 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.828323 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.828344 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.828359 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.931488 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.931536 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.931548 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.931568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:45 crc kubenswrapper[4830]: I0311 09:15:45.931582 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:45Z","lastTransitionTime":"2026-03-11T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.034732 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.034796 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.034811 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.034834 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.034849 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.137729 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.137771 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.137784 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.137802 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.137815 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.241290 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.241335 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.241345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.241364 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.241375 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.344580 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.344633 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.344651 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.344674 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.344690 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.414819 4830 generic.go:334] "Generic (PLEG): container finished" podID="2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde" containerID="c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239" exitCode=0 Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.414899 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerDied","Data":"c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.448393 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.448459 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.448481 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.448510 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.448532 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.454479 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.477251 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.496501 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.525594 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.545927 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.555212 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.555272 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.555291 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.555316 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.555333 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.568193 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.587363 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 
09:15:46.608416 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.630903 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.649259 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.658576 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.659137 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.659165 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.659197 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.659218 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.673702 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.697068 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.721362 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:46Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.763299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.763449 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.763468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc 
kubenswrapper[4830]: I0311 09:15:46.763494 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.763514 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.865966 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.865996 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.866004 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.866040 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.866051 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.933262 4830 scope.go:117] "RemoveContainer" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.933770 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:46 crc kubenswrapper[4830]: E0311 09:15:46.933905 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.934457 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:46 crc kubenswrapper[4830]: E0311 09:15:46.934565 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.934827 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:46 crc kubenswrapper[4830]: E0311 09:15:46.934961 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.970659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.970715 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.970733 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.970756 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:46 crc kubenswrapper[4830]: I0311 09:15:46.970773 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:46Z","lastTransitionTime":"2026-03-11T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.074804 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.074854 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.074863 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.074880 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.074895 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.177468 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.177514 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.177535 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.177560 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.177578 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.284602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.284659 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.284678 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.284700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.284722 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.387294 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.387344 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.387356 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.387376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.387388 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.421913 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.428817 4830 generic.go:334] "Generic (PLEG): container finished" podID="2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde" containerID="2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c" exitCode=0 Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.428886 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerDied","Data":"2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.431932 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.440402 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.440852 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.456980 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.469517 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.478545 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.490006 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.490893 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.490960 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.490975 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.491001 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.491043 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.508982 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.520008 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.545877 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.569374 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.589899 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.595179 4830 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.595234 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.595253 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.595278 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.595297 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.604426 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.624719 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.639391 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.659404 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.677714 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.693503 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.698076 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc 
kubenswrapper[4830]: I0311 09:15:47.698150 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.698167 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.698194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.698212 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.714166 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.745593 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.768056 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.784890 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.801119 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.801175 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.801194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.801217 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.801236 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.807800 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.823342 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.898416 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.909344 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.909754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.909765 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.909781 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.909794 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:47Z","lastTransitionTime":"2026-03-11T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.916161 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.945548 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.962784 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:47 crc kubenswrapper[4830]: I0311 09:15:47.984430 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:47Z is after 2025-08-24T17:21:41Z" Mar 11 
09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.013766 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.013818 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.013833 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.013852 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.013867 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.117992 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.118120 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.118140 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.118180 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.118198 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.221174 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.221210 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.221219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.221237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.221247 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.324529 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.324572 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.324582 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.324602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.324614 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.433249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.433317 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.433337 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.433363 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.433389 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.449916 4830 generic.go:334] "Generic (PLEG): container finished" podID="2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde" containerID="d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893" exitCode=0 Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.449993 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerDied","Data":"d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.482835 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaea
d203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"o
s-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\
\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.521159 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.538271 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.538341 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.538363 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.538393 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.538414 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.542351 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.556409 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.569716 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.580243 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.596416 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.614387 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.630284 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.640768 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.640819 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.640831 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.640850 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.640862 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.643424 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.655998 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.681618 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.698954 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:48Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.744174 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.744214 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.744227 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.744244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.744257 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.847518 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.847573 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.847590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.847613 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.847632 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.931908 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.932059 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.931931 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:48 crc kubenswrapper[4830]: E0311 09:15:48.932151 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:48 crc kubenswrapper[4830]: E0311 09:15:48.932242 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:48 crc kubenswrapper[4830]: E0311 09:15:48.932419 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.950414 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.950466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.950484 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.950508 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:48 crc kubenswrapper[4830]: I0311 09:15:48.950526 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:48Z","lastTransitionTime":"2026-03-11T09:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.053737 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.053808 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.053830 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.053861 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.053886 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.156834 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.156902 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.156919 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.156943 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.156960 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.191602 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-cxfbj"] Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.192275 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.194457 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.195106 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.195509 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.199344 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.212428 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.232175 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.248706 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.259769 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.259839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.259856 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.259880 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.259897 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.280828 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.299266 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.308340 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/428c70b1-9ac3-45a8-9482-b3908f7eed8b-host\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.308425 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/428c70b1-9ac3-45a8-9482-b3908f7eed8b-serviceca\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.308506 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npsrs\" (UniqueName: \"kubernetes.io/projected/428c70b1-9ac3-45a8-9482-b3908f7eed8b-kube-api-access-npsrs\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.318248 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.331378 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.343874 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.359461 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.363553 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.363579 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.363590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.363608 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.363622 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.397677 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.409922 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/428c70b1-9ac3-45a8-9482-b3908f7eed8b-host\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.410090 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/428c70b1-9ac3-45a8-9482-b3908f7eed8b-serviceca\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.410126 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/428c70b1-9ac3-45a8-9482-b3908f7eed8b-host\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.410197 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npsrs\" (UniqueName: 
\"kubernetes.io/projected/428c70b1-9ac3-45a8-9482-b3908f7eed8b-kube-api-access-npsrs\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.411916 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/428c70b1-9ac3-45a8-9482-b3908f7eed8b-serviceca\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.415393 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.434713 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npsrs\" (UniqueName: \"kubernetes.io/projected/428c70b1-9ac3-45a8-9482-b3908f7eed8b-kube-api-access-npsrs\") pod \"node-ca-cxfbj\" (UID: \"428c70b1-9ac3-45a8-9482-b3908f7eed8b\") " pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.441762 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.460636 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.464431 4830 generic.go:334] "Generic (PLEG): container finished" podID="2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde" containerID="f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f" exitCode=0 Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.464513 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerDied","Data":"f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.468244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.468299 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.468319 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.468345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.468365 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.474971 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.491198 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.510287 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.513523 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cxfbj" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.531885 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: W0311 09:15:49.548307 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428c70b1_9ac3_45a8_9482_b3908f7eed8b.slice/crio-d0d9d037800c791a911052a4848647dd29d22935f07fec548e509330e892c161 WatchSource:0}: Error finding container d0d9d037800c791a911052a4848647dd29d22935f07fec548e509330e892c161: Status 404 returned error can't find the container with id d0d9d037800c791a911052a4848647dd29d22935f07fec548e509330e892c161 Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.550763 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.570666 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.576450 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.576550 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.576570 4830 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.576600 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.576617 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.587375 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.604970 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.621113 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.641573 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.653257 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.668147 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.678980 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.679007 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.679136 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.679153 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.679165 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.691983 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.704599 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.717115 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:49Z is after 2025-08-24T17:21:41Z" Mar 11 
09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.781680 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.781754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.781768 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.781785 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.781797 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.884201 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.884234 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.884242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.884257 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.884268 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.987353 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.987418 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.987435 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.987461 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:49 crc kubenswrapper[4830]: I0311 09:15:49.987479 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:49Z","lastTransitionTime":"2026-03-11T09:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.090153 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.090193 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.090203 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.090218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.090228 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.192108 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.192161 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.192173 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.192195 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.192208 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.294194 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.294236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.294246 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.294261 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.294272 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.398124 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.398190 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.398211 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.398237 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.398256 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.471917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cxfbj" event={"ID":"428c70b1-9ac3-45a8-9482-b3908f7eed8b","Type":"ContainerStarted","Data":"7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.471998 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cxfbj" event={"ID":"428c70b1-9ac3-45a8-9482-b3908f7eed8b","Type":"ContainerStarted","Data":"d0d9d037800c791a911052a4848647dd29d22935f07fec548e509330e892c161"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.477734 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" event={"ID":"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde","Type":"ContainerStarted","Data":"cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.484092 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08"}
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.484574 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.501328 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.501388 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.501405 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.501428 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.501445 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.509639 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.529555 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.533202 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.551062 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.568102 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.588371 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.605598 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.605664 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.605691 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.605721 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.605744 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.615252 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.646446 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.661969 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.675210 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.688535 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.701196 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.709295 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.709344 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.709356 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.709376 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.709391 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.715950 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.730674 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.747255 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.768349 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.785774 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.800558 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 
09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.812076 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.812108 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.812117 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.812132 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.812141 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.816413 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.837549 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.855634 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plug
ins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2d
aed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.893265 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.910435 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.915265 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.915314 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.915323 4830 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.915336 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.915345 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:50Z","lastTransitionTime":"2026-03-11T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.932080 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.932152 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.932261 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.932299 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:50 crc kubenswrapper[4830]: E0311 09:15:50.932266 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:50 crc kubenswrapper[4830]: E0311 09:15:50.932503 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:50 crc kubenswrapper[4830]: E0311 09:15:50.932597 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.945466 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.960200 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678f
cd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.980045 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:50 crc kubenswrapper[4830]: I0311 09:15:50.996861 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:50Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.009626 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.017536 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.017618 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.017635 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc 
kubenswrapper[4830]: I0311 09:15:51.017652 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.017663 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.120345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.120402 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.120419 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.120443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.120461 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.182221 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.182277 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.182294 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.182322 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.182341 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: E0311 09:15:51.202436 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.209615 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.209671 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.209690 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.209732 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.209772 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: E0311 09:15:51.233452 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.238848 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.238911 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.238932 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.238958 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.238977 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: E0311 09:15:51.256645 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.262345 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.262428 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.262443 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.262464 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.262480 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: E0311 09:15:51.279610 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.285173 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.285207 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.285219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.285238 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.285250 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: E0311 09:15:51.299868 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: E0311 09:15:51.300208 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.302670 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.302724 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.302740 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.302774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.302795 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.405561 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.405624 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.405640 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.405669 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.405694 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.492169 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.492239 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.509413 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.509466 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.509483 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.509508 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.509525 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.527915 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.568470 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a242
8894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.591709 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.612516 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.612570 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.612590 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.612682 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.612703 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.613472 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.630382 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.650711 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.672185 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.695547 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.717643 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.718095 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.718218 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.718242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.718266 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.718284 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.738154 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.752961 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.765339 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.780890 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.795408 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.811852 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:51Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.821269 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.821331 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.821361 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc 
kubenswrapper[4830]: I0311 09:15:51.821391 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.821414 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.923646 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.923703 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.923718 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.923737 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:51 crc kubenswrapper[4830]: I0311 09:15:51.923749 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:51Z","lastTransitionTime":"2026-03-11T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.027196 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.027250 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.027267 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.027290 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.027307 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.130340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.130373 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.130384 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.130400 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.130410 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.232610 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.232646 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.232658 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.232700 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.232715 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.335423 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.335754 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.335763 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.335776 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.335785 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.441261 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.441312 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.441324 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.441340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.441358 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.543493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.543525 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.543534 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.543550 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.543576 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.645987 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.646018 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.646029 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.646059 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.646070 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.748127 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.748168 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.748180 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.748198 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.748210 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.843812 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.843967 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.844027 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.844112 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844162 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:52 crc 
kubenswrapper[4830]: E0311 09:15:52.844183 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844195 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844223 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844161 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:16:24.844116901 +0000 UTC m=+152.625267630 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844280 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.844312 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844397 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:16:24.844352098 +0000 UTC m=+152.625502827 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844575 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844661 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844692 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844750 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 09:16:24.844514482 +0000 UTC m=+152.625665211 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844821 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:16:24.84480428 +0000 UTC m=+152.625955069 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.844921 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:16:24.844901563 +0000 UTC m=+152.626052382 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.853882 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.854043 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.854065 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.854090 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.854110 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:15:52Z","lastTransitionTime":"2026-03-11T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.932231 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.932263 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.932580 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.932632 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.932835 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.933002 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.953696 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:52Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:52 crc kubenswrapper[4830]: E0311 09:15:52.954762 4830 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.971430 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:52Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:52 crc kubenswrapper[4830]: I0311 09:15:52.987866 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:52Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.003551 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.030790 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: E0311 09:15:53.032320 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.055858 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.083220 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.104111 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.119005 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.137380 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.151526 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.164646 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.187856 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.203211 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.537200 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/0.log" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.542882 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08" exitCode=1 Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.542962 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08"} Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.544322 4830 scope.go:117] "RemoveContainer" containerID="3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.566146 4830 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.586599 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.612871 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.640499 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:52Z\\\",\\\"message\\\":\\\" *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 09:15:52.650637 6671 handler.go:208] Removed *v1.NetworkPolicy event 
handler 4\\\\nI0311 09:15:52.650658 6671 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:52.650661 6671 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651020 6671 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651621 6671 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 09:15:52.651666 6671 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 09:15:52.651677 6671 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 09:15:52.651689 6671 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:52.651742 6671 factory.go:656] Stopping watch factory\\\\nI0311 09:15:52.651758 6671 ovnkube.go:599] Stopped ovnkube\\\\nI0311 09:15:52.651778 6671 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:52.651788 6671 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:52.651794 6671 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.659231 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.679467 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.699940 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.717124 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.740142 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.761741 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.783153 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.815422 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.833352 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:53 crc kubenswrapper[4830]: I0311 09:15:53.845212 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:53Z is after 2025-08-24T17:21:41Z" Mar 11 
09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.546348 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/0.log" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.547880 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6"} Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.548646 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.560429 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d
95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.571580 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.583862 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.604960 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.621082 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.635836 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 
09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.647817 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.659306 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:54 crc kubenswrapper[4830]: I0311 09:15:54.676132 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"
name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:54Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.010792 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:55 crc kubenswrapper[4830]: E0311 09:15:55.011096 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.011164 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:55 crc kubenswrapper[4830]: E0311 09:15:55.011277 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.012361 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:55 crc kubenswrapper[4830]: E0311 09:15:55.013385 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.034416 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:52Z\\\",\\\"message\\\":\\\" *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 09:15:52.650637 6671 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:52.650658 6671 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:52.650661 6671 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651020 6671 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651621 6671 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 09:15:52.651666 6671 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 09:15:52.651677 6671 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 09:15:52.651689 6671 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:52.651742 6671 factory.go:656] Stopping watch factory\\\\nI0311 09:15:52.651758 6671 ovnkube.go:599] Stopped ovnkube\\\\nI0311 09:15:52.651778 6671 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:52.651788 6671 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:52.651794 6671 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.048919 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.061596 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.070603 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.080211 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.189157 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q"] Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.189630 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.191926 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.196313 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.212857 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.212902 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.212971 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gpvq\" (UniqueName: \"kubernetes.io/projected/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-kube-api-access-5gpvq\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.213007 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.219407 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.233075 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.245092 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 
09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.256970 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.273339 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.285410 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.300387 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerI
D\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.314203 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gpvq\" (UniqueName: \"kubernetes.io/projected/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-kube-api-access-5gpvq\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.314256 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.314786 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.314846 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.314879 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.315324 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.322164 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: 
\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.329542 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:52Z\\\",\\\"message\\\":\\\" *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 09:15:52.650637 6671 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:52.650658 6671 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:52.650661 6671 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651020 6671 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651621 6671 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 09:15:52.651666 6671 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 09:15:52.651677 6671 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 09:15:52.651689 6671 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:52.651742 6671 factory.go:656] Stopping watch factory\\\\nI0311 09:15:52.651758 6671 ovnkube.go:599] Stopped ovnkube\\\\nI0311 09:15:52.651778 6671 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:52.651788 6671 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:52.651794 6671 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.334034 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gpvq\" (UniqueName: \"kubernetes.io/projected/9e6a369e-c5b9-4911-83bc-1ea3a21a472e-kube-api-access-5gpvq\") pod \"ovnkube-control-plane-749d76644c-4pd2q\" (UID: \"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.344772 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.356408 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.369388 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.379364 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.394316 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.409785 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.422613 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.511552 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" Mar 11 09:15:55 crc kubenswrapper[4830]: W0311 09:15:55.532860 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e6a369e_c5b9_4911_83bc_1ea3a21a472e.slice/crio-5b8de20d89eae1b8fe90aad01ef51d46a5dd33258263201fb68efe3a8b81a6d5 WatchSource:0}: Error finding container 5b8de20d89eae1b8fe90aad01ef51d46a5dd33258263201fb68efe3a8b81a6d5: Status 404 returned error can't find the container with id 5b8de20d89eae1b8fe90aad01ef51d46a5dd33258263201fb68efe3a8b81a6d5 Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.552760 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" event={"ID":"9e6a369e-c5b9-4911-83bc-1ea3a21a472e","Type":"ContainerStarted","Data":"5b8de20d89eae1b8fe90aad01ef51d46a5dd33258263201fb68efe3a8b81a6d5"} Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.555961 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/1.log" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.557152 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/0.log" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.562669 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6" exitCode=1 Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.562717 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" 
event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6"} Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.562753 4830 scope.go:117] "RemoveContainer" containerID="3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.564191 4830 scope.go:117] "RemoveContainer" containerID="8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6" Mar 11 09:15:55 crc kubenswrapper[4830]: E0311 09:15:55.564460 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.597514 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.618947 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.633511 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 
09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.646268 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.659190 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.672599 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.688876 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerI
D\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.711588 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:52Z\\\",\\\"message\\\":\\\" *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 09:15:52.650637 6671 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:52.650658 6671 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:52.650661 6671 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651020 6671 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651621 6671 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 09:15:52.651666 6671 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 09:15:52.651677 6671 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 09:15:52.651689 6671 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:52.651742 6671 factory.go:656] Stopping watch factory\\\\nI0311 09:15:52.651758 6671 ovnkube.go:599] Stopped ovnkube\\\\nI0311 09:15:52.651778 6671 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:52.651788 6671 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:52.651794 6671 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"ved *v1.Pod event handler 3\\\\nI0311 09:15:55.194588 6888 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:55.194600 6888 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 09:15:55.194609 6888 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:55.194613 6888 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 09:15:55.194618 6888 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 09:15:55.194619 6888 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0311 09:15:55.194626 6888 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:55.194635 6888 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:55.194641 6888 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:55.194636 6888 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0311 09:15:55.194672 6888 factory.go:656] Stopping watch factory\\\\nI0311 09:15:55.194670 6888 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 09:15:55.194684 6888 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 09:15:55.194796 6888 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.733339 4830 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.749395 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.758656 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.769329 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.780895 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.798773 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.816608 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.954456 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zl7s2"] Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.959927 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:55 crc kubenswrapper[4830]: E0311 09:15:55.960210 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:15:55 crc kubenswrapper[4830]: I0311 09:15:55.991658 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:55Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.013110 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.020680 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.020739 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mxng\" (UniqueName: \"kubernetes.io/projected/e06af9c6-9acb-4a23-bc91-01fd25fa4915-kube-api-access-9mxng\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.032674 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.044246 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.064123 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.077386 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.090503 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerI
D\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.105402 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:52Z\\\",\\\"message\\\":\\\" *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 09:15:52.650637 6671 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:52.650658 6671 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:52.650661 6671 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651020 6671 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651621 6671 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 09:15:52.651666 6671 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 09:15:52.651677 6671 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 09:15:52.651689 6671 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:52.651742 6671 factory.go:656] Stopping watch factory\\\\nI0311 09:15:52.651758 6671 ovnkube.go:599] Stopped ovnkube\\\\nI0311 09:15:52.651778 6671 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:52.651788 6671 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:52.651794 6671 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"ved *v1.Pod event handler 3\\\\nI0311 09:15:55.194588 6888 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:55.194600 6888 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 09:15:55.194609 6888 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:55.194613 6888 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 09:15:55.194618 6888 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 09:15:55.194619 6888 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0311 09:15:55.194626 6888 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:55.194635 6888 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:55.194641 6888 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:55.194636 6888 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0311 09:15:55.194672 6888 factory.go:656] Stopping watch factory\\\\nI0311 09:15:55.194670 6888 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 09:15:55.194684 6888 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 09:15:55.194796 6888 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.116993 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc 
kubenswrapper[4830]: I0311 09:15:56.121327 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.121392 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mxng\" (UniqueName: \"kubernetes.io/projected/e06af9c6-9acb-4a23-bc91-01fd25fa4915-kube-api-access-9mxng\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:56 crc kubenswrapper[4830]: E0311 09:15:56.121462 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:15:56 crc kubenswrapper[4830]: E0311 09:15:56.121510 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs podName:e06af9c6-9acb-4a23-bc91-01fd25fa4915 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:56.621497808 +0000 UTC m=+124.402648497 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs") pod "network-metrics-daemon-zl7s2" (UID: "e06af9c6-9acb-4a23-bc91-01fd25fa4915") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.127686 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.137852 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.143289 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mxng\" (UniqueName: \"kubernetes.io/projected/e06af9c6-9acb-4a23-bc91-01fd25fa4915-kube-api-access-9mxng\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " 
pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.147085 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m
gfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.156193 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.168463 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.180830 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.191856 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.568760 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" event={"ID":"9e6a369e-c5b9-4911-83bc-1ea3a21a472e","Type":"ContainerStarted","Data":"bf1506624754da45f2ab26b08db36be2eb76162f393840d83ed835c22cce8995"} Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.569311 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" 
event={"ID":"9e6a369e-c5b9-4911-83bc-1ea3a21a472e","Type":"ContainerStarted","Data":"c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040"} Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.571395 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/1.log" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.576660 4830 scope.go:117] "RemoveContainer" containerID="8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6" Mar 11 09:15:56 crc kubenswrapper[4830]: E0311 09:15:56.576847 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.591530 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.604936 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.617113 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.625933 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:56 crc kubenswrapper[4830]: E0311 09:15:56.627160 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:15:56 crc kubenswrapper[4830]: E0311 09:15:56.627272 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs podName:e06af9c6-9acb-4a23-bc91-01fd25fa4915 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:57.627246169 +0000 UTC m=+125.408396898 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs") pod "network-metrics-daemon-zl7s2" (UID: "e06af9c6-9acb-4a23-bc91-01fd25fa4915") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.633648 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec
2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb76162f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"p
odIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.657758 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.677657 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.696511 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.725317 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b6075d38288f31cfaa193b228c4ae96c018ea004a0f3b81c3d02d9bb1973d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:52Z\\\",\\\"message\\\":\\\" *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0311 09:15:52.650637 6671 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:52.650658 6671 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:52.650661 6671 
reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651020 6671 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0311 09:15:52.651621 6671 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0311 09:15:52.651666 6671 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0311 09:15:52.651677 6671 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0311 09:15:52.651689 6671 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:52.651742 6671 factory.go:656] Stopping watch factory\\\\nI0311 09:15:52.651758 6671 ovnkube.go:599] Stopped ovnkube\\\\nI0311 09:15:52.651778 6671 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:52.651788 6671 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:52.651794 6671 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"ved *v1.Pod event handler 3\\\\nI0311 09:15:55.194588 6888 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:55.194600 6888 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 09:15:55.194609 6888 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:55.194613 6888 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 09:15:55.194618 6888 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 09:15:55.194619 6888 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0311 09:15:55.194626 6888 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:55.194635 6888 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:55.194641 6888 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:55.194636 6888 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0311 09:15:55.194672 6888 factory.go:656] Stopping watch factory\\\\nI0311 09:15:55.194670 6888 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 09:15:55.194684 6888 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 09:15:55.194796 6888 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.742982 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc 
kubenswrapper[4830]: I0311 09:15:56.754553 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.766870 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09
:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.792935 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.805999 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a9
67754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.818200 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.832552 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.843403 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.855145 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.867967 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c
32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.884234 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf988
9789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e
496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.906109 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"ved *v1.Pod event handler 3\\\\nI0311 09:15:55.194588 6888 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:55.194600 6888 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 09:15:55.194609 6888 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:55.194613 6888 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 09:15:55.194618 6888 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 09:15:55.194619 6888 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 09:15:55.194626 6888 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:55.194635 6888 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:55.194641 6888 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:55.194636 6888 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0311 09:15:55.194672 6888 factory.go:656] Stopping watch factory\\\\nI0311 09:15:55.194670 6888 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 09:15:55.194684 6888 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 09:15:55.194796 6888 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.918199 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.931433 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.931523 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.931590 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.931594 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:56 crc kubenswrapper[4830]: E0311 09:15:56.931689 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:56 crc kubenswrapper[4830]: E0311 09:15:56.931799 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:56 crc kubenswrapper[4830]: E0311 09:15:56.931867 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.945392 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.950491 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.962740 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.973531 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.985652 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:56 crc kubenswrapper[4830]: I0311 09:15:56.997803 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:56Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:57 crc kubenswrapper[4830]: I0311 09:15:57.011546 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:57Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:57 crc kubenswrapper[4830]: I0311 09:15:57.031558 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:57Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:57 crc kubenswrapper[4830]: I0311 09:15:57.044541 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:57Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:57 crc kubenswrapper[4830]: I0311 09:15:57.057264 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:57Z is after 2025-08-24T17:21:41Z" Mar 11 
09:15:57 crc kubenswrapper[4830]: I0311 09:15:57.071985 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb76162f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:57Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:57 crc kubenswrapper[4830]: I0311 09:15:57.636716 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:57 crc kubenswrapper[4830]: E0311 09:15:57.636977 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:15:57 crc kubenswrapper[4830]: E0311 09:15:57.637123 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs podName:e06af9c6-9acb-4a23-bc91-01fd25fa4915 nodeName:}" failed. No retries permitted until 2026-03-11 09:15:59.637093468 +0000 UTC m=+127.418244187 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs") pod "network-metrics-daemon-zl7s2" (UID: "e06af9c6-9acb-4a23-bc91-01fd25fa4915") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:15:57 crc kubenswrapper[4830]: I0311 09:15:57.931419 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:57 crc kubenswrapper[4830]: E0311 09:15:57.931584 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:15:58 crc kubenswrapper[4830]: E0311 09:15:58.033155 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.523617 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.538726 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.550439 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.573523 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d
2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:
15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.593233 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"ved *v1.Pod event handler 3\\\\nI0311 09:15:55.194588 6888 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:55.194600 6888 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 09:15:55.194609 6888 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:55.194613 6888 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 09:15:55.194618 6888 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 09:15:55.194619 6888 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 09:15:55.194626 6888 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:55.194635 6888 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:55.194641 6888 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:55.194636 6888 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0311 09:15:55.194672 6888 factory.go:656] Stopping watch factory\\\\nI0311 09:15:55.194670 6888 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 09:15:55.194684 6888 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 09:15:55.194796 6888 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.604713 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.622696 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.638452 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.654660 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.669424 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.686191 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.705769 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.724269 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.737755 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.761735 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.778436 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.795223 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.809700 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:15:58Z is after 2025-08-24T17:21:41Z" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.931712 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.931842 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:15:58 crc kubenswrapper[4830]: E0311 09:15:58.931940 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:15:58 crc kubenswrapper[4830]: I0311 09:15:58.932000 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:15:58 crc kubenswrapper[4830]: E0311 09:15:58.932237 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:15:58 crc kubenswrapper[4830]: E0311 09:15:58.932353 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:15:59 crc kubenswrapper[4830]: I0311 09:15:59.657395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:59 crc kubenswrapper[4830]: E0311 09:15:59.657604 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:15:59 crc kubenswrapper[4830]: E0311 09:15:59.657722 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs podName:e06af9c6-9acb-4a23-bc91-01fd25fa4915 nodeName:}" failed. No retries permitted until 2026-03-11 09:16:03.657694582 +0000 UTC m=+131.438845311 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs") pod "network-metrics-daemon-zl7s2" (UID: "e06af9c6-9acb-4a23-bc91-01fd25fa4915") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:15:59 crc kubenswrapper[4830]: I0311 09:15:59.932299 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:15:59 crc kubenswrapper[4830]: E0311 09:15:59.932539 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:00 crc kubenswrapper[4830]: I0311 09:16:00.932318 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:00 crc kubenswrapper[4830]: E0311 09:16:00.932487 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:00 crc kubenswrapper[4830]: I0311 09:16:00.932522 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:00 crc kubenswrapper[4830]: I0311 09:16:00.932322 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:00 crc kubenswrapper[4830]: E0311 09:16:00.932781 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:00 crc kubenswrapper[4830]: E0311 09:16:00.932916 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:00 crc kubenswrapper[4830]: I0311 09:16:00.946502 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.623219 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.623270 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.623282 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.623302 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.623315 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:01Z","lastTransitionTime":"2026-03-11T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:01 crc kubenswrapper[4830]: E0311 09:16:01.644390 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:01Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.649129 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.649176 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.649184 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.649198 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.649208 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:01Z","lastTransitionTime":"2026-03-11T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:01 crc kubenswrapper[4830]: E0311 09:16:01.685816 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:01Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.691968 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.692007 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.692041 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.692061 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.692072 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:01Z","lastTransitionTime":"2026-03-11T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:01 crc kubenswrapper[4830]: E0311 09:16:01.712719 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:01Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.718774 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.719130 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.719244 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.719330 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.719409 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:01Z","lastTransitionTime":"2026-03-11T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:01 crc kubenswrapper[4830]: E0311 09:16:01.731956 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:01Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.736242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.736340 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.736399 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.736457 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.736510 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:01Z","lastTransitionTime":"2026-03-11T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:01 crc kubenswrapper[4830]: E0311 09:16:01.751420 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:01Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:01 crc kubenswrapper[4830]: E0311 09:16:01.751574 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:16:01 crc kubenswrapper[4830]: I0311 09:16:01.931982 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:01 crc kubenswrapper[4830]: E0311 09:16:01.932515 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:02 crc kubenswrapper[4830]: I0311 09:16:02.932483 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:02 crc kubenswrapper[4830]: I0311 09:16:02.932513 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:02 crc kubenswrapper[4830]: E0311 09:16:02.932692 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:02 crc kubenswrapper[4830]: E0311 09:16:02.932990 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:02 crc kubenswrapper[4830]: I0311 09:16:02.934130 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:02 crc kubenswrapper[4830]: E0311 09:16:02.934223 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:02 crc kubenswrapper[4830]: I0311 09:16:02.951584 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:02Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:02 crc kubenswrapper[4830]: I0311 09:16:02.966504 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:02Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:02 crc kubenswrapper[4830]: I0311 09:16:02.980179 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:02Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:02 crc kubenswrapper[4830]: I0311 09:16:02.994602 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:02Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.009490 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.029600 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: E0311 09:16:03.033926 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.052747 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.069205 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.085097 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.101471 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a
9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.115357 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.125731 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.147962 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.161678 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.184197 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda5967804
47250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.213071 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"ved *v1.Pod event handler 3\\\\nI0311 09:15:55.194588 6888 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:55.194600 6888 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 09:15:55.194609 6888 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:55.194613 6888 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 09:15:55.194618 6888 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 09:15:55.194619 6888 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 09:15:55.194626 6888 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:55.194635 6888 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:55.194641 6888 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:55.194636 6888 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0311 09:15:55.194672 6888 factory.go:656] Stopping watch factory\\\\nI0311 09:15:55.194670 6888 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 09:15:55.194684 6888 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 09:15:55.194796 6888 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.226948 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.241011 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:03Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.701747 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:03 crc kubenswrapper[4830]: E0311 09:16:03.701933 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:16:03 crc kubenswrapper[4830]: E0311 09:16:03.702059 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs podName:e06af9c6-9acb-4a23-bc91-01fd25fa4915 nodeName:}" failed. No retries permitted until 2026-03-11 09:16:11.702000024 +0000 UTC m=+139.483150743 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs") pod "network-metrics-daemon-zl7s2" (UID: "e06af9c6-9acb-4a23-bc91-01fd25fa4915") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:16:03 crc kubenswrapper[4830]: I0311 09:16:03.932077 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:03 crc kubenswrapper[4830]: E0311 09:16:03.932258 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:04 crc kubenswrapper[4830]: I0311 09:16:04.931659 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:04 crc kubenswrapper[4830]: E0311 09:16:04.931833 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:04 crc kubenswrapper[4830]: I0311 09:16:04.932076 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:04 crc kubenswrapper[4830]: I0311 09:16:04.932215 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:04 crc kubenswrapper[4830]: E0311 09:16:04.932349 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:04 crc kubenswrapper[4830]: E0311 09:16:04.932600 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:05 crc kubenswrapper[4830]: I0311 09:16:05.932459 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:05 crc kubenswrapper[4830]: E0311 09:16:05.932660 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:06 crc kubenswrapper[4830]: I0311 09:16:06.932206 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:06 crc kubenswrapper[4830]: E0311 09:16:06.932355 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:06 crc kubenswrapper[4830]: I0311 09:16:06.932519 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:06 crc kubenswrapper[4830]: E0311 09:16:06.932645 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:06 crc kubenswrapper[4830]: I0311 09:16:06.932671 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:06 crc kubenswrapper[4830]: E0311 09:16:06.932742 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:07 crc kubenswrapper[4830]: I0311 09:16:07.932447 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:07 crc kubenswrapper[4830]: E0311 09:16:07.933154 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:08 crc kubenswrapper[4830]: E0311 09:16:08.035626 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:16:08 crc kubenswrapper[4830]: I0311 09:16:08.932144 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:08 crc kubenswrapper[4830]: E0311 09:16:08.932328 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:08 crc kubenswrapper[4830]: I0311 09:16:08.932631 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:08 crc kubenswrapper[4830]: E0311 09:16:08.932756 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:08 crc kubenswrapper[4830]: I0311 09:16:08.932978 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:08 crc kubenswrapper[4830]: E0311 09:16:08.933244 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:09 crc kubenswrapper[4830]: I0311 09:16:09.931580 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:09 crc kubenswrapper[4830]: E0311 09:16:09.931778 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:09 crc kubenswrapper[4830]: I0311 09:16:09.933134 4830 scope.go:117] "RemoveContainer" containerID="8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.630135 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/1.log" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.633086 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3"} Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.633561 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.654033 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.682416 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d
2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:
15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.710842 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"ved *v1.Pod event handler 3\\\\nI0311 09:15:55.194588 6888 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:55.194600 6888 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 09:15:55.194609 6888 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:55.194613 6888 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 09:15:55.194618 6888 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 09:15:55.194619 6888 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 09:15:55.194626 6888 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:55.194635 6888 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:55.194641 6888 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:55.194636 6888 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0311 09:15:55.194672 6888 factory.go:656] Stopping watch factory\\\\nI0311 09:15:55.194670 6888 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 09:15:55.194684 6888 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 09:15:55.194796 6888 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netn
s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},
{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.724112 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc 
kubenswrapper[4830]: I0311 09:16:10.733968 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.745974 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.759696 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.771309 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.787941 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.801765 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.815722 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.831745 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.844374 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.855650 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.872452 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a
9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.894003 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.908376 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.932214 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.932251 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.932214 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:10 crc kubenswrapper[4830]: I0311 09:16:10.932272 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:10Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:10 crc kubenswrapper[4830]: E0311 09:16:10.932424 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:10 crc kubenswrapper[4830]: E0311 09:16:10.932648 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:10 crc kubenswrapper[4830]: E0311 09:16:10.932793 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.639986 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/2.log" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.640978 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/1.log" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.645561 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3" exitCode=1 Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.645637 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3"} Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.645717 4830 scope.go:117] "RemoveContainer" containerID="8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.646896 4830 scope.go:117] "RemoveContainer" containerID="e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3" Mar 11 
09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.647302 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.680289 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.695814 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.715102 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.733814 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.753864 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.789570 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.797942 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.798234 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.798357 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs podName:e06af9c6-9acb-4a23-bc91-01fd25fa4915 nodeName:}" failed. No retries permitted until 2026-03-11 09:16:27.798321993 +0000 UTC m=+155.579472722 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs") pod "network-metrics-daemon-zl7s2" (UID: "e06af9c6-9acb-4a23-bc91-01fd25fa4915") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.810723 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.831353 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.846851 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.860677 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.875456 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.883160 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.883216 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.883236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.883266 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.883288 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:11Z","lastTransitionTime":"2026-03-11T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.892906 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.898349 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.903426 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.903476 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.903495 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.903519 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.903537 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:11Z","lastTransitionTime":"2026-03-11T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.918618 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.921228 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f27e52590dbc2be9f741b481b66ae0ac777e133cdedb3a473533c1a7e45b2c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"ved *v1.Pod event handler 3\\\\nI0311 09:15:55.194588 6888 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0311 09:15:55.194600 6888 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0311 09:15:55.194609 6888 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0311 09:15:55.194613 6888 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0311 09:15:55.194618 6888 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0311 09:15:55.194619 6888 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0311 09:15:55.194626 6888 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0311 09:15:55.194635 6888 handler.go:208] Removed *v1.Node event handler 2\\\\nI0311 09:15:55.194641 6888 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0311 09:15:55.194636 6888 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0311 09:15:55.194672 6888 factory.go:656] Stopping watch factory\\\\nI0311 09:15:55.194670 6888 handler.go:208] Removed *v1.Node event handler 7\\\\nI0311 09:15:55.194684 6888 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0311 09:15:55.194796 6888 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3
cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.924785 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.924825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.924840 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.924866 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.924888 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:11Z","lastTransitionTime":"2026-03-11T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.933058 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.933385 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.934344 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.944328 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"
sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":48599861
6},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\
\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.949858 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.950462 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.950513 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.950532 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.950553 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.950565 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:11Z","lastTransitionTime":"2026-03-11T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.964392 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.969653 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.974655 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.974686 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.974702 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.974719 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.974731 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:11Z","lastTransitionTime":"2026-03-11T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.980180 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.992380 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:11 crc kubenswrapper[4830]: E0311 09:16:11.992654 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:16:11 crc kubenswrapper[4830]: I0311 09:16:11.996930 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd1
5c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:11Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.656431 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/2.log" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.660956 4830 scope.go:117] "RemoveContainer" containerID="e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3" Mar 11 09:16:12 crc kubenswrapper[4830]: E0311 09:16:12.661173 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" 
pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.677651 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc 
kubenswrapper[4830]: I0311 09:16:12.695550 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.717627 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09
:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.741257 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.766009 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.782361 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.799353 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.816679 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.832707 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.853506 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.870248 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.884308 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.898616 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.921462 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.931941 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.932129 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.932467 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:12 crc kubenswrapper[4830]: E0311 09:16:12.932584 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:12 crc kubenswrapper[4830]: E0311 09:16:12.932713 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:12 crc kubenswrapper[4830]: E0311 09:16:12.932861 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.956220 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:12 crc kubenswrapper[4830]: I0311 09:16:12.984292 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a
9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:12Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.008433 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.032877 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: E0311 09:16:13.037159 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.058436 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.083854 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.100380 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.120909 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.140338 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.159997 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.183274 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.203901 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.224618 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.263263 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.290172 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.313138 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.332954 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.356616 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.380228 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.408138 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.444309 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.464592 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:13Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:13 crc kubenswrapper[4830]: I0311 09:16:13.932377 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:13 crc kubenswrapper[4830]: E0311 09:16:13.932666 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:14 crc kubenswrapper[4830]: I0311 09:16:14.931586 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:14 crc kubenswrapper[4830]: I0311 09:16:14.931710 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:14 crc kubenswrapper[4830]: E0311 09:16:14.931826 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:14 crc kubenswrapper[4830]: I0311 09:16:14.931586 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:14 crc kubenswrapper[4830]: E0311 09:16:14.931996 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:14 crc kubenswrapper[4830]: E0311 09:16:14.932141 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:15 crc kubenswrapper[4830]: I0311 09:16:15.932298 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:15 crc kubenswrapper[4830]: E0311 09:16:15.932534 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:16 crc kubenswrapper[4830]: I0311 09:16:16.931662 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:16 crc kubenswrapper[4830]: E0311 09:16:16.931884 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:16 crc kubenswrapper[4830]: I0311 09:16:16.931673 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:16 crc kubenswrapper[4830]: I0311 09:16:16.931713 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:16 crc kubenswrapper[4830]: E0311 09:16:16.932130 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:16 crc kubenswrapper[4830]: E0311 09:16:16.932275 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:17 crc kubenswrapper[4830]: I0311 09:16:17.931725 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:17 crc kubenswrapper[4830]: E0311 09:16:17.932067 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:18 crc kubenswrapper[4830]: E0311 09:16:18.039206 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:16:18 crc kubenswrapper[4830]: I0311 09:16:18.931729 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:18 crc kubenswrapper[4830]: I0311 09:16:18.931794 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:18 crc kubenswrapper[4830]: E0311 09:16:18.932177 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:18 crc kubenswrapper[4830]: I0311 09:16:18.932219 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:18 crc kubenswrapper[4830]: E0311 09:16:18.932367 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:18 crc kubenswrapper[4830]: E0311 09:16:18.932555 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:19 crc kubenswrapper[4830]: I0311 09:16:19.931550 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:19 crc kubenswrapper[4830]: E0311 09:16:19.931953 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:19 crc kubenswrapper[4830]: I0311 09:16:19.947207 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 11 09:16:20 crc kubenswrapper[4830]: I0311 09:16:20.932155 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:20 crc kubenswrapper[4830]: I0311 09:16:20.932215 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:20 crc kubenswrapper[4830]: E0311 09:16:20.932346 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:20 crc kubenswrapper[4830]: I0311 09:16:20.932388 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:20 crc kubenswrapper[4830]: E0311 09:16:20.932546 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:20 crc kubenswrapper[4830]: E0311 09:16:20.932694 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:21 crc kubenswrapper[4830]: I0311 09:16:21.931761 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:21 crc kubenswrapper[4830]: E0311 09:16:21.932076 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.254972 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.255070 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.255099 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.255133 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.255157 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:22Z","lastTransitionTime":"2026-03-11T09:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.276596 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:22Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.280498 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.280559 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.280578 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.280602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.280619 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:22Z","lastTransitionTime":"2026-03-11T09:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.301147 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:22Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.305715 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.305777 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.305799 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.305828 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.305846 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:22Z","lastTransitionTime":"2026-03-11T09:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.328120 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:22Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.333360 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.333415 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.333433 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.333457 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.333476 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:22Z","lastTransitionTime":"2026-03-11T09:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.351561 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:22Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.356825 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.356870 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.356889 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.356914 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.356966 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:22Z","lastTransitionTime":"2026-03-11T09:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.376872 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:22Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.377128 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.931774 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.931863 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.931998 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.931989 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.932190 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:22 crc kubenswrapper[4830]: E0311 09:16:22.932334 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.955699 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06
bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:22Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.976141 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:22Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:22 crc kubenswrapper[4830]: I0311 09:16:22.992229 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:22Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.034012 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: E0311 09:16:23.039866 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.051860 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.067910 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d
2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:
15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.095746 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.112232 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.130839 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.146136 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.162659 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.174162 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.188265 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.200864 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.216327 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.237364 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.254843 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.273140 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.286635 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:23Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:23 crc kubenswrapper[4830]: I0311 09:16:23.932409 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:23 crc kubenswrapper[4830]: E0311 09:16:23.932973 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:24 crc kubenswrapper[4830]: I0311 09:16:24.854661 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:16:24 crc kubenswrapper[4830]: I0311 09:16:24.854852 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.854867 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:28.85483774 +0000 UTC m=+216.635988459 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:16:24 crc kubenswrapper[4830]: I0311 09:16:24.854929 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:24 crc kubenswrapper[4830]: I0311 09:16:24.854981 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855007 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855076 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855097 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:16:24 crc kubenswrapper[4830]: I0311 09:16:24.855107 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855150 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 09:17:28.85513572 +0000 UTC m=+216.636286439 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855203 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855269 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855335 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:17:28.855306115 +0000 UTC m=+216.636456834 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855367 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:17:28.855353327 +0000 UTC m=+216.636504046 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855916 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855949 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.855995 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.856330 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:17:28.856100519 +0000 UTC m=+216.637251248 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:16:24 crc kubenswrapper[4830]: I0311 09:16:24.932430 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:24 crc kubenswrapper[4830]: I0311 09:16:24.932503 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:24 crc kubenswrapper[4830]: I0311 09:16:24.932443 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.932634 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.932743 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:24 crc kubenswrapper[4830]: E0311 09:16:24.932904 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:25 crc kubenswrapper[4830]: I0311 09:16:25.931845 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:25 crc kubenswrapper[4830]: E0311 09:16:25.932116 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:25 crc kubenswrapper[4830]: I0311 09:16:25.932962 4830 scope.go:117] "RemoveContainer" containerID="e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3" Mar 11 09:16:25 crc kubenswrapper[4830]: E0311 09:16:25.933420 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:16:26 crc kubenswrapper[4830]: I0311 09:16:26.931708 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:26 crc kubenswrapper[4830]: I0311 09:16:26.931788 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:26 crc kubenswrapper[4830]: E0311 09:16:26.932069 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:26 crc kubenswrapper[4830]: E0311 09:16:26.932345 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:26 crc kubenswrapper[4830]: I0311 09:16:26.932406 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:26 crc kubenswrapper[4830]: E0311 09:16:26.932666 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:27 crc kubenswrapper[4830]: I0311 09:16:27.812718 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:27 crc kubenswrapper[4830]: E0311 09:16:27.813120 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:16:27 crc kubenswrapper[4830]: E0311 09:16:27.813284 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs podName:e06af9c6-9acb-4a23-bc91-01fd25fa4915 nodeName:}" failed. No retries permitted until 2026-03-11 09:16:59.813254702 +0000 UTC m=+187.594405431 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs") pod "network-metrics-daemon-zl7s2" (UID: "e06af9c6-9acb-4a23-bc91-01fd25fa4915") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:16:27 crc kubenswrapper[4830]: I0311 09:16:27.931752 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:27 crc kubenswrapper[4830]: E0311 09:16:27.931986 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:28 crc kubenswrapper[4830]: E0311 09:16:28.041689 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:16:28 crc kubenswrapper[4830]: I0311 09:16:28.931793 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:28 crc kubenswrapper[4830]: I0311 09:16:28.931901 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:28 crc kubenswrapper[4830]: E0311 09:16:28.931979 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:28 crc kubenswrapper[4830]: I0311 09:16:28.931994 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:28 crc kubenswrapper[4830]: E0311 09:16:28.932220 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:28 crc kubenswrapper[4830]: E0311 09:16:28.932437 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:29 crc kubenswrapper[4830]: I0311 09:16:29.931356 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:29 crc kubenswrapper[4830]: E0311 09:16:29.931562 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.743784 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/0.log" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.743869 4830 generic.go:334] "Generic (PLEG): container finished" podID="75fdb109-77cf-4d97-ac3c-6f3139b3bb7a" containerID="4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b" exitCode=1 Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.743915 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8w98l" event={"ID":"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a","Type":"ContainerDied","Data":"4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b"} Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.744532 4830 scope.go:117] "RemoveContainer" containerID="4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.760663 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.780548 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.800131 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.823934 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.841316 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.864214 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.883827 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.905537 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.925401 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.931733 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:30 crc kubenswrapper[4830]: E0311 09:16:30.931892 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.931737 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:30 crc kubenswrapper[4830]: E0311 09:16:30.932185 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.932357 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:30 crc kubenswrapper[4830]: E0311 09:16:30.932430 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.947488 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.968684 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:30 crc kubenswrapper[4830]: I0311 09:16:30.986285 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:30Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.019947 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.037897 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.064367 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a
9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.092211 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.111168 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.127543 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.141901 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:29Z\\\",\\\"message\\\":\\\"2026-03-11T09:15:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c\\\\n2026-03-11T09:15:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c to /host/opt/cni/bin/\\\\n2026-03-11T09:15:44Z [verbose] multus-daemon started\\\\n2026-03-11T09:15:44Z [verbose] Readiness Indicator file check\\\\n2026-03-11T09:16:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.750824 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/0.log" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.750919 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8w98l" event={"ID":"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a","Type":"ContainerStarted","Data":"f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223"} Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.769983 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.807323 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.826992 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.846227 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.878503 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.896308 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.913436 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.932083 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:31 crc kubenswrapper[4830]: E0311 09:16:31.932306 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.937210 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:29Z\\\",\\\"message\\\":\\\"2026-03-11T09:15:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c\\\\n2026-03-11T09:15:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c to /host/opt/cni/bin/\\\\n2026-03-11T09:15:44Z [verbose] multus-daemon started\\\\n2026-03-11T09:15:44Z [verbose] Readiness Indicator file check\\\\n2026-03-11T09:16:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.961902 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a
9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:31 crc kubenswrapper[4830]: I0311 09:16:31.982302 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:31Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.004827 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.027860 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.047124 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.071415 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.092302 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.110819 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.132807 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.148502 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.162834 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.527795 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.527901 4830 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.527971 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.528011 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.528092 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:32Z","lastTransitionTime":"2026-03-11T09:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.548896 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.555497 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.555567 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.555585 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.555614 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.555635 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:32Z","lastTransitionTime":"2026-03-11T09:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.576470 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.581718 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.581788 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.581809 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.581839 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.581861 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:32Z","lastTransitionTime":"2026-03-11T09:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.612363 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.618192 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.618236 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.618249 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.618272 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.618286 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:32Z","lastTransitionTime":"2026-03-11T09:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.639129 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.645493 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.645585 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.645605 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.645662 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.645683 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:32Z","lastTransitionTime":"2026-03-11T09:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.665854 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.666294 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.931748 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.931816 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.931912 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.932135 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.931737 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:32 crc kubenswrapper[4830]: E0311 09:16:32.933105 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.948906 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.968386 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:32 crc kubenswrapper[4830]: I0311 09:16:32.986682 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.000477 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:32Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.021322 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.039882 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: E0311 09:16:33.043856 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.055803 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.070370 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.087108 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.103995 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.119697 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.150514 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.171125 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.183407 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.213583 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.229364 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.247056 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.265180 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:29Z\\\",\\\"message\\\":\\\"2026-03-11T09:15:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c\\\\n2026-03-11T09:15:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c to /host/opt/cni/bin/\\\\n2026-03-11T09:15:44Z [verbose] multus-daemon started\\\\n2026-03-11T09:15:44Z [verbose] Readiness Indicator file check\\\\n2026-03-11T09:16:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.286672 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cb
eacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202
c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-1
1T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:33Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:33 crc kubenswrapper[4830]: I0311 09:16:33.931708 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:33 crc kubenswrapper[4830]: E0311 09:16:33.932429 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:34 crc kubenswrapper[4830]: I0311 09:16:34.932138 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:34 crc kubenswrapper[4830]: I0311 09:16:34.932248 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:34 crc kubenswrapper[4830]: I0311 09:16:34.932138 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:34 crc kubenswrapper[4830]: E0311 09:16:34.932393 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:34 crc kubenswrapper[4830]: E0311 09:16:34.932514 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:34 crc kubenswrapper[4830]: E0311 09:16:34.932667 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:35 crc kubenswrapper[4830]: I0311 09:16:35.932101 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:35 crc kubenswrapper[4830]: E0311 09:16:35.932350 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:36 crc kubenswrapper[4830]: I0311 09:16:36.931688 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:36 crc kubenswrapper[4830]: I0311 09:16:36.931710 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:36 crc kubenswrapper[4830]: I0311 09:16:36.931886 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:36 crc kubenswrapper[4830]: E0311 09:16:36.931869 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:36 crc kubenswrapper[4830]: E0311 09:16:36.932118 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:36 crc kubenswrapper[4830]: E0311 09:16:36.932203 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:37 crc kubenswrapper[4830]: I0311 09:16:37.931676 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:37 crc kubenswrapper[4830]: E0311 09:16:37.931977 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:38 crc kubenswrapper[4830]: E0311 09:16:38.045682 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:16:38 crc kubenswrapper[4830]: I0311 09:16:38.931627 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:38 crc kubenswrapper[4830]: I0311 09:16:38.931908 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:38 crc kubenswrapper[4830]: I0311 09:16:38.931943 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:38 crc kubenswrapper[4830]: E0311 09:16:38.932086 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:38 crc kubenswrapper[4830]: E0311 09:16:38.932238 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:38 crc kubenswrapper[4830]: I0311 09:16:38.932324 4830 scope.go:117] "RemoveContainer" containerID="e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3" Mar 11 09:16:38 crc kubenswrapper[4830]: E0311 09:16:38.932399 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.790822 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/2.log" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.794555 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.796086 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.808955 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.823410 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.836518 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:29Z\\\",\\\"message\\\":\\\"2026-03-11T09:15:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c\\\\n2026-03-11T09:15:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c to /host/opt/cni/bin/\\\\n2026-03-11T09:15:44Z [verbose] multus-daemon started\\\\n2026-03-11T09:15:44Z [verbose] Readiness Indicator file check\\\\n2026-03-11T09:16:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.854558 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cb
eacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202
c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-1
1T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.881111 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.901280 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.918442 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.929616 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.931874 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:39 crc kubenswrapper[4830]: E0311 09:16:39.932092 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.941554 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.956922 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.969886 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.982396 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:39 crc kubenswrapper[4830]: I0311 09:16:39.997873 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:39Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.015143 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.032288 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.053249 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.066438 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a
9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.079372 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.093361 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.800743 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/3.log" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.801849 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/2.log" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.806587 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" exitCode=1 Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.806657 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.806706 4830 scope.go:117] "RemoveContainer" containerID="e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.807951 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:16:40 crc kubenswrapper[4830]: E0311 09:16:40.808467 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.836302 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.859065 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.880722 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.900498 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.915846 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.929609 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.931933 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:40 crc kubenswrapper[4830]: E0311 09:16:40.932080 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.932128 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:40 crc kubenswrapper[4830]: E0311 09:16:40.932182 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.932307 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:40 crc kubenswrapper[4830]: E0311 09:16:40.932357 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.950282 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.967076 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:40 crc kubenswrapper[4830]: I0311 09:16:40.980511 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.000980 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:40Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.023700 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:29Z\\\",\\\"message\\\":\\\"2026-03-11T09:15:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c\\\\n2026-03-11T09:15:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c to /host/opt/cni/bin/\\\\n2026-03-11T09:15:44Z [verbose] multus-daemon started\\\\n2026-03-11T09:15:44Z [verbose] Readiness Indicator file check\\\\n2026-03-11T09:16:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.038723 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.068914 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e868e54bd7d9b8ef6d893900c1956021a77fb1e7fd209fc4014ef783dcda23c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:10Z\\\",\\\"message\\\":\\\"ng lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"97419c58-41c7-41d7-a137-a446f0c7eeb3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.138\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0311 09:16:10.926792 7128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:39Z\\\",\\\"message\\\":\\\"ject: *v1.Pod openshift-multus/multus-8w98l\\\\nI0311 09:16:39.771464 7445 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-8w98l in node crc\\\\nI0311 09:16:39.771468 7445 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-8w98l after 0 failed attempt(s)\\\\nI0311 09:16:39.771473 7445 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zl7s2] creating logical port openshift-multus_network-metrics-daemon-zl7s2 for pod on switch crc\\\\nI0311 09:16:39.771470 7445 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI0311 09:16:39.771450 7445 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gtl5j after 0 failed attempt(s)\\\\nI0311 09:16:39.771485 7445 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gtl5j\\\\nF0311 09:16:39.771446 7445 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controll\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7
e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.077676 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.091206 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.110078 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.121426 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.132941 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.144170 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.816660 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/3.log" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.821472 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:16:41 crc kubenswrapper[4830]: E0311 09:16:41.821640 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.838799 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.856740 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.872078 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.881986 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.891903 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.905817 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.927532 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.932003 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:41 crc kubenswrapper[4830]: E0311 09:16:41.932248 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.948294 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.968794 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:41 crc kubenswrapper[4830]: I0311 09:16:41.986902 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:41Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.022353 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.046807 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\",\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.072122 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.091275 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.116185 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.139498 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:29Z\\\",\\\"message\\\":\\\"2026-03-11T09:15:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c\\\\n2026-03-11T09:15:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c to /host/opt/cni/bin/\\\\n2026-03-11T09:15:44Z [verbose] multus-daemon started\\\\n2026-03-11T09:15:44Z [verbose] Readiness Indicator file check\\\\n2026-03-11T09:16:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.162723 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a
9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.194637 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:39Z\\\",\\\"message\\\":\\\"ject: *v1.Pod openshift-multus/multus-8w98l\\\\nI0311 09:16:39.771464 7445 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-8w98l in node crc\\\\nI0311 09:16:39.771468 7445 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-8w98l after 0 failed attempt(s)\\\\nI0311 09:16:39.771473 7445 
base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zl7s2] creating logical port openshift-multus_network-metrics-daemon-zl7s2 for pod on switch crc\\\\nI0311 09:16:39.771470 7445 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI0311 09:16:39.771450 7445 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gtl5j after 0 failed attempt(s)\\\\nI0311 09:16:39.771485 7445 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gtl5j\\\\nF0311 09:16:39.771446 7445 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controll\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.213441 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.741556 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.741603 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.741620 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.741661 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.741680 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:42Z","lastTransitionTime":"2026-03-11T09:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:42 crc kubenswrapper[4830]: E0311 09:16:42.762349 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.768328 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.768392 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.768420 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.768455 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.768481 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:42Z","lastTransitionTime":"2026-03-11T09:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:42 crc kubenswrapper[4830]: E0311 09:16:42.788597 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.794528 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.794594 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.794614 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.794640 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.794659 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:42Z","lastTransitionTime":"2026-03-11T09:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: E0311 09:16:42.873210 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.931991 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.932165 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.932169 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:42 crc kubenswrapper[4830]: E0311 09:16:42.932365 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:42 crc kubenswrapper[4830]: E0311 09:16:42.932635 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:42 crc kubenswrapper[4830]: E0311 09:16:42.932766 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.960011 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:42 crc kubenswrapper[4830]: I0311 09:16:42.982764 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.001886 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:42Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.020904 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.042634 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: E0311 09:16:43.047152 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.065599 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.082867 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.102217 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.124279 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.146626 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.183368 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-11T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.207600 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a
9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.229101 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.246631 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb761
62f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.260550 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc 
kubenswrapper[4830]: I0311 09:16:43.273862 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.288902 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"20
26-03-11T09:16:29Z\\\",\\\"message\\\":\\\"2026-03-11T09:15:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c\\\\n2026-03-11T09:15:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c to /host/opt/cni/bin/\\\\n2026-03-11T09:15:44Z [verbose] multus-daemon started\\\\n2026-03-11T09:15:44Z [verbose] Readiness Indicator file check\\\\n2026-03-11T09:16:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"h
ostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.308361 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a
9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.334279 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:39Z\\\",\\\"message\\\":\\\"ject: *v1.Pod openshift-multus/multus-8w98l\\\\nI0311 09:16:39.771464 7445 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-8w98l in node crc\\\\nI0311 09:16:39.771468 7445 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-8w98l after 0 failed attempt(s)\\\\nI0311 09:16:39.771473 7445 
base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zl7s2] creating logical port openshift-multus_network-metrics-daemon-zl7s2 for pod on switch crc\\\\nI0311 09:16:39.771470 7445 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI0311 09:16:39.771450 7445 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gtl5j after 0 failed attempt(s)\\\\nI0311 09:16:39.771485 7445 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gtl5j\\\\nF0311 09:16:39.771446 7445 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controll\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:43Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:43 crc kubenswrapper[4830]: I0311 09:16:43.931605 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:43 crc kubenswrapper[4830]: E0311 09:16:43.931996 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:44 crc kubenswrapper[4830]: I0311 09:16:44.931472 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:44 crc kubenswrapper[4830]: I0311 09:16:44.931546 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:44 crc kubenswrapper[4830]: E0311 09:16:44.932051 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:44 crc kubenswrapper[4830]: E0311 09:16:44.932173 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:44 crc kubenswrapper[4830]: I0311 09:16:44.931652 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:44 crc kubenswrapper[4830]: E0311 09:16:44.932280 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:45 crc kubenswrapper[4830]: I0311 09:16:45.932269 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:45 crc kubenswrapper[4830]: E0311 09:16:45.932476 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:46 crc kubenswrapper[4830]: I0311 09:16:46.932073 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:46 crc kubenswrapper[4830]: I0311 09:16:46.932108 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:46 crc kubenswrapper[4830]: I0311 09:16:46.932383 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:46 crc kubenswrapper[4830]: E0311 09:16:46.932304 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:46 crc kubenswrapper[4830]: E0311 09:16:46.932573 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:46 crc kubenswrapper[4830]: E0311 09:16:46.932684 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:47 crc kubenswrapper[4830]: I0311 09:16:47.932264 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:47 crc kubenswrapper[4830]: E0311 09:16:47.932639 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:48 crc kubenswrapper[4830]: E0311 09:16:48.047982 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:16:48 crc kubenswrapper[4830]: I0311 09:16:48.931846 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:48 crc kubenswrapper[4830]: I0311 09:16:48.931956 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:48 crc kubenswrapper[4830]: E0311 09:16:48.932034 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:48 crc kubenswrapper[4830]: I0311 09:16:48.932093 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:48 crc kubenswrapper[4830]: E0311 09:16:48.932247 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:48 crc kubenswrapper[4830]: E0311 09:16:48.932328 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:49 crc kubenswrapper[4830]: I0311 09:16:49.932064 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:49 crc kubenswrapper[4830]: E0311 09:16:49.932184 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:50 crc kubenswrapper[4830]: I0311 09:16:50.931720 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:50 crc kubenswrapper[4830]: E0311 09:16:50.931912 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:50 crc kubenswrapper[4830]: I0311 09:16:50.932044 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:50 crc kubenswrapper[4830]: I0311 09:16:50.932080 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:50 crc kubenswrapper[4830]: E0311 09:16:50.932245 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:50 crc kubenswrapper[4830]: E0311 09:16:50.932422 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:51 crc kubenswrapper[4830]: I0311 09:16:51.932375 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:51 crc kubenswrapper[4830]: E0311 09:16:51.932563 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:52 crc kubenswrapper[4830]: I0311 09:16:52.932198 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:52 crc kubenswrapper[4830]: I0311 09:16:52.932254 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:52 crc kubenswrapper[4830]: E0311 09:16:52.932470 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:52 crc kubenswrapper[4830]: E0311 09:16:52.933122 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:52 crc kubenswrapper[4830]: I0311 09:16:52.933534 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:52 crc kubenswrapper[4830]: E0311 09:16:52.933708 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:52 crc kubenswrapper[4830]: I0311 09:16:52.968399 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c358daac-da7d-407e-8423-8fec9b8646fa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://201b6cd7ff0b9688c0263b9d5317f3556e15c8443d75ab25e58f77d52df79618\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cda1b23dd5dd191d83a13e4a341d39bcb968e0a6cb2a80df9f52491c1e531794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc28555e1819ea260592299d74073051992a1d3c252f13dbe0cd727835fbcfb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a3581f4c16a31a99c91dd4d06b56724d61f34b1f7675dc92bb6bb0106be233f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c5271daba438ec5787e34c555c802508c68e348b7ef04615b32711eb8a87cd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0bc0fdc9f277619ab6d739cf05a567819abbb60d291f04563ef521c124d48b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1855f09d566452f85d06728635d072628384735d9a78e0831663de8f91c0efb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a2428894ff67b374709027253cbc3f82c5ea08f90f38af1d50fb0f497204594\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-1
1T09:13:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:52Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:52 crc kubenswrapper[4830]: I0311 09:16:52.992691 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7bc2420-5f7a-4113-84ab-58279da87f62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:15:03Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0311 09:15:03.632136 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0311 09:15:03.632246 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0311 09:15:03.632840 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-324060289/tls.crt::/tmp/serving-cert-324060289/tls.key\\\\\\\"\\\\nI0311 09:15:03.927433 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0311 09:15:03.931509 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0311 09:15:03.931531 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0311 09:15:03.931549 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0311 09:15:03.931556 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0311 09:15:03.935805 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0311 09:15:03.935829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935835 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0311 09:15:03.935839 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0311 09:15:03.935843 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0311 09:15:03.935845 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0311 09:15:03.935848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0311 09:15:03.935853 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0311 09:15:03.936907 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:56Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8a85e7db9d30a889bececcdfa7dda36a
9cc6125265b661c12b48103a358e76f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:52Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.008670 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faaab78fec942e9130cce8eca41eb1030156e5d3dfeb929456c59640d5f5c629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff2c7d4156f5951673462b1be177ca0126a719990113cd3be202f0d9779093d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.016515 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.016578 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.016602 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.016634 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.016658 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:53Z","lastTransitionTime":"2026-03-11T09:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.023445 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6a369e-c5b9-4911-83bc-1ea3a21a472e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2df532d39321e16e9931f6b272140b59bbc60b4b51e30597eec2bfb4ac48040\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf1506624754da45f2ab26b08db36be2eb76162f393840d83ed835c22cce8995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5gpvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4pd2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.033663 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.037346 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e06af9c6-9acb-4a23-bc91-01fd25fa4915\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9mxng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zl7s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc 
kubenswrapper[4830]: I0311 09:16:53.038662 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.038739 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.038759 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.039538 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.040440 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:53Z","lastTransitionTime":"2026-03-11T09:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.049305 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.054392 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94030ef6138bfc03204bf68e88478d0fb0194437c359c6573501b9213a12543d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.058488 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-8
3531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.062735 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.062903 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.062935 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.063102 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.063292 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:53Z","lastTransitionTime":"2026-03-11T09:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.070902 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8w98l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:29Z\\\",\\\"message\\\":\\\"2026-03-11T09:15:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c\\\\n2026-03-11T09:15:44+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93f785c4-7798-403d-ba30-dcc31d00815c to /host/opt/cni/bin/\\\\n2026-03-11T09:15:44Z [verbose] multus-daemon started\\\\n2026-03-11T09:15:44Z [verbose] Readiness Indicator file check\\\\n2026-03-11T09:16:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:16:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9rlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8w98l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.081680 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.086431 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.086612 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.086759 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.086899 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.086817 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-vgww4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d4bb584-ca5b-4e05-b3be-a4a3c1a92fde\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa3e37fd55ca61fff44698e076b5c7aebf9889789a02f6a10c688326aa4b1fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28e181735e1f88353211aae52f96b33dbb86289237c22d45d2a33c57a0434fd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2b066105cc4dd27cbeacece69d8fe98f4a497edca59b98722be9d1de3935c5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c287c2be99e5090202c7fa5d88be8503b4fb2b598ee89fd826e060fbf7d1a239\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eb0a
9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eb0a9649a461e70136ab5bf1f51f9b205ba718af8555b9533075db913fa150c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d04c861c5179b3ac963d7911eecfc5bedf1e62a20cee97c3bec08bc0e6f7c893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2f3df39c44999a7b1efda596780447250179a5df8359e68dce3d935491f259f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wpnz7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-vgww4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.087074 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:53Z","lastTransitionTime":"2026-03-11T09:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.105925 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.111157 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.111568 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.111763 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.111942 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.112148 4830 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:16:53Z","lastTransitionTime":"2026-03-11T09:16:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.113505 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:45Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-11T09:16:39Z\\\",\\\"message\\\":\\\"ject: *v1.Pod openshift-multus/multus-8w98l\\\\nI0311 09:16:39.771464 7445 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-8w98l in node crc\\\\nI0311 09:16:39.771468 7445 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-8w98l after 0 failed attempt(s)\\\\nI0311 09:16:39.771473 7445 
base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-zl7s2] creating logical port openshift-multus_network-metrics-daemon-zl7s2 for pod on switch crc\\\\nI0311 09:16:39.771470 7445 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-scheduler/openshift-kube-scheduler-crc\\\\nI0311 09:16:39.771450 7445 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-gtl5j after 0 failed attempt(s)\\\\nI0311 09:16:39.771485 7445 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-gtl5j\\\\nF0311 09:16:39.771446 7445 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controll\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:16:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6824d7f30f42fd27c6
e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wqfbx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gtl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.128555 4830 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-11T09:16:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fae51179-5520-451c-a453-83531ae25f54\\\",\\\"systemUUID\\\":\\\"2fdf51c2-3f48-4868-b069-1f3b038d7ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.128898 4830 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.130062 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.147703 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.163365 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c4pts" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e816263-8213-4e07-b0e1-a5963aa3381c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://61c06d862ae8b47eecf9ad9d27f93018948fcc30f51a6f7135fd7ffad02ef87f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mgfsq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c4pts\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.176926 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-cxfbj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428c70b1-9ac3-45a8-9482-b3908f7eed8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e73f434fb8cab0ff26ab8ac678fcd17497a967754710c2033363e88e639e763\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-npsrs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-cxfbj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.201292 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.213546 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bdde2fd-3db4-4b41-9287-58960dcab5d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65cbf3998b701e45051fec1c53d7b6d5389f350e5d166d933cf4ae24b777aa5b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b
5e48957e15fadac93873a241\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rbzc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:15:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p7jq8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.225995 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52afa103-dcf4-4598-bdc2-ea9e3290abb3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ded5b1c3bf6311b99669c49f1485d71611fa610647707f228fe28a292da878d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5767f03942f705c7e0ba96ae562829fb148c785c5ae418439a4111aa4f2b7de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.242850 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e6c1b37-b6bf-45c9-aed4-07c9b986c13c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ecb3b2f4652fef5405954db5ec41e9d20536dbefa2eba2584fb5e758492969e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3b76b135a05803d2616b474b9f08447be92e1415efce83fcd2d01ddad09b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d110d28cf47bd0109b6feddba6901f2677f0fb792d61495866079664e016d7f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://adeb8ce516b5b43c1dbabde4d6345a4f85758b47212c4d386a2d7bd6002fc9cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.260390 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9dd9fa9-00aa-4c48-ac31-cfea7d833421\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-11T09:13:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30ef1572a29f3f159a0603eaff84ae4e507224a0c14f504e48eef1f1b4ae8894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b09ae8f54bd03621838dfdb69bee425582869e7c50c83bdd0b1eb29614d44d48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-11T09:14:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0311 09:14:25.566529 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0311 09:14:25.567575 1 observer_polling.go:159] Starting file observer\\\\nI0311 09:14:25.568583 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0311 09:14:25.569287 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0311 09:14:50.786405 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0311 09:14:53.747624 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0311 09:14:53.747692 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-11T09:14:25Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75dee0e35bad363e4f7a9e298a2e11d82cd9352b0270bfe4976856f9ca7247c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef9c40d470323d4d2ea0871e80450d5aad5edb5bddd86c087d8272ffaa7a2eb1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:13:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-11T09:13:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.281762 4830 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-11T09:15:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://565c1968a8b56e18cfef07ec3a3101ca40cfd03b1e5f7a69ac61480f86bec3fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-11T09:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-11T09:16:53Z is after 2025-08-24T17:21:41Z" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.932143 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:53 crc kubenswrapper[4830]: I0311 09:16:53.932437 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.932530 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:53 crc kubenswrapper[4830]: E0311 09:16:53.932595 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:16:54 crc kubenswrapper[4830]: I0311 09:16:54.932234 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:54 crc kubenswrapper[4830]: I0311 09:16:54.932350 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:54 crc kubenswrapper[4830]: E0311 09:16:54.932407 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:54 crc kubenswrapper[4830]: E0311 09:16:54.932533 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:54 crc kubenswrapper[4830]: I0311 09:16:54.932593 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:54 crc kubenswrapper[4830]: E0311 09:16:54.932673 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:55 crc kubenswrapper[4830]: I0311 09:16:55.932346 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:55 crc kubenswrapper[4830]: E0311 09:16:55.932540 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:56 crc kubenswrapper[4830]: I0311 09:16:56.932398 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:56 crc kubenswrapper[4830]: I0311 09:16:56.932443 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:56 crc kubenswrapper[4830]: I0311 09:16:56.932436 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:56 crc kubenswrapper[4830]: E0311 09:16:56.932527 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:56 crc kubenswrapper[4830]: E0311 09:16:56.932630 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:56 crc kubenswrapper[4830]: E0311 09:16:56.932874 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:57 crc kubenswrapper[4830]: I0311 09:16:57.932304 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:57 crc kubenswrapper[4830]: E0311 09:16:57.932512 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:16:58 crc kubenswrapper[4830]: E0311 09:16:58.051920 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:16:58 crc kubenswrapper[4830]: I0311 09:16:58.932113 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:16:58 crc kubenswrapper[4830]: I0311 09:16:58.932158 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:16:58 crc kubenswrapper[4830]: I0311 09:16:58.932142 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:16:58 crc kubenswrapper[4830]: E0311 09:16:58.932294 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:16:58 crc kubenswrapper[4830]: E0311 09:16:58.932383 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:16:58 crc kubenswrapper[4830]: E0311 09:16:58.932476 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:16:59 crc kubenswrapper[4830]: I0311 09:16:59.879628 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:59 crc kubenswrapper[4830]: E0311 09:16:59.879844 4830 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:16:59 crc kubenswrapper[4830]: E0311 09:16:59.879937 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs podName:e06af9c6-9acb-4a23-bc91-01fd25fa4915 nodeName:}" failed. No retries permitted until 2026-03-11 09:18:03.879909882 +0000 UTC m=+251.661060601 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs") pod "network-metrics-daemon-zl7s2" (UID: "e06af9c6-9acb-4a23-bc91-01fd25fa4915") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 11 09:16:59 crc kubenswrapper[4830]: I0311 09:16:59.932417 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:16:59 crc kubenswrapper[4830]: E0311 09:16:59.932632 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:00 crc kubenswrapper[4830]: I0311 09:17:00.931754 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:00 crc kubenswrapper[4830]: I0311 09:17:00.931851 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:00 crc kubenswrapper[4830]: E0311 09:17:00.932129 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:00 crc kubenswrapper[4830]: I0311 09:17:00.932243 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:00 crc kubenswrapper[4830]: E0311 09:17:00.932454 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:00 crc kubenswrapper[4830]: E0311 09:17:00.932557 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:01 crc kubenswrapper[4830]: I0311 09:17:01.931685 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:01 crc kubenswrapper[4830]: E0311 09:17:01.931880 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:02 crc kubenswrapper[4830]: I0311 09:17:02.931999 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:02 crc kubenswrapper[4830]: I0311 09:17:02.932098 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:02 crc kubenswrapper[4830]: E0311 09:17:02.932363 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:02 crc kubenswrapper[4830]: I0311 09:17:02.932391 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:02 crc kubenswrapper[4830]: E0311 09:17:02.932526 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:02 crc kubenswrapper[4830]: E0311 09:17:02.932637 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:02 crc kubenswrapper[4830]: I0311 09:17:02.981556 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8w98l" podStartSLOduration=120.981530019 podStartE2EDuration="2m0.981530019s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:02.981184508 +0000 UTC m=+190.762335257" watchObservedRunningTime="2026-03-11 09:17:02.981530019 +0000 UTC m=+190.762680748" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.016119 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vgww4" podStartSLOduration=121.016091418 podStartE2EDuration="2m1.016091418s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.015071317 +0000 UTC m=+190.796222066" watchObservedRunningTime="2026-03-11 09:17:03.016091418 +0000 UTC m=+190.797242147" Mar 11 09:17:03 crc kubenswrapper[4830]: E0311 09:17:03.052897 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.156864 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c4pts" podStartSLOduration=121.156829255 podStartE2EDuration="2m1.156829255s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.155633668 +0000 UTC m=+190.936784367" watchObservedRunningTime="2026-03-11 09:17:03.156829255 +0000 UTC m=+190.937979984" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.193388 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cxfbj" podStartSLOduration=121.193341816 podStartE2EDuration="2m1.193341816s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.171015169 +0000 UTC m=+190.952165868" watchObservedRunningTime="2026-03-11 09:17:03.193341816 +0000 UTC m=+190.974492545" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.221348 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=63.2213275 podStartE2EDuration="1m3.2213275s" podCreationTimestamp="2026-03-11 09:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.193566983 +0000 UTC m=+190.974717662" watchObservedRunningTime="2026-03-11 09:17:03.2213275 +0000 UTC m=+191.002478219" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.241065 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=67.241016206 podStartE2EDuration="1m7.241016206s" 
podCreationTimestamp="2026-03-11 09:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.220550576 +0000 UTC m=+191.001701275" watchObservedRunningTime="2026-03-11 09:17:03.241016206 +0000 UTC m=+191.022166915" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.241846 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=44.241837272 podStartE2EDuration="44.241837272s" podCreationTimestamp="2026-03-11 09:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.24083975 +0000 UTC m=+191.021990459" watchObservedRunningTime="2026-03-11 09:17:03.241837272 +0000 UTC m=+191.022987991" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.343911 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=101.34389202 podStartE2EDuration="1m41.34389202s" podCreationTimestamp="2026-03-11 09:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.343314512 +0000 UTC m=+191.124465231" watchObservedRunningTime="2026-03-11 09:17:03.34389202 +0000 UTC m=+191.125042709" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.344210 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podStartSLOduration=121.34420272 podStartE2EDuration="2m1.34420272s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.301136534 +0000 UTC m=+191.082287233" 
watchObservedRunningTime="2026-03-11 09:17:03.34420272 +0000 UTC m=+191.125353409" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.362931 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=102.362911164 podStartE2EDuration="1m42.362911164s" podCreationTimestamp="2026-03-11 09:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.3627813 +0000 UTC m=+191.143931999" watchObservedRunningTime="2026-03-11 09:17:03.362911164 +0000 UTC m=+191.144061853" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.467183 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.467230 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.467242 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.467258 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.467269 4830 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-11T09:17:03Z","lastTransitionTime":"2026-03-11T09:17:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.525317 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4pd2q" podStartSLOduration=120.525292218 podStartE2EDuration="2m0.525292218s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:03.405733232 +0000 UTC m=+191.186883961" watchObservedRunningTime="2026-03-11 09:17:03.525292218 +0000 UTC m=+191.306442927" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.525954 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf"] Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.526778 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.529350 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.529740 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.530183 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.530419 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.620757 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1b53e5d9-f7be-4704-af8e-157d5ca11102-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.620817 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b53e5d9-f7be-4704-af8e-157d5ca11102-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.620863 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b53e5d9-f7be-4704-af8e-157d5ca11102-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.620961 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1b53e5d9-f7be-4704-af8e-157d5ca11102-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.620998 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1b53e5d9-f7be-4704-af8e-157d5ca11102-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.722811 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1b53e5d9-f7be-4704-af8e-157d5ca11102-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.722701 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1b53e5d9-f7be-4704-af8e-157d5ca11102-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.723520 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1b53e5d9-f7be-4704-af8e-157d5ca11102-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.723656 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b53e5d9-f7be-4704-af8e-157d5ca11102-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.723698 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1b53e5d9-f7be-4704-af8e-157d5ca11102-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.723746 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b53e5d9-f7be-4704-af8e-157d5ca11102-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.723841 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1b53e5d9-f7be-4704-af8e-157d5ca11102-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.725387 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b53e5d9-f7be-4704-af8e-157d5ca11102-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.736197 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b53e5d9-f7be-4704-af8e-157d5ca11102-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 
09:17:03.748126 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b53e5d9-f7be-4704-af8e-157d5ca11102-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rrrvf\" (UID: \"1b53e5d9-f7be-4704-af8e-157d5ca11102\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.844485 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" Mar 11 09:17:03 crc kubenswrapper[4830]: W0311 09:17:03.869052 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b53e5d9_f7be_4704_af8e_157d5ca11102.slice/crio-5f914875287f63533a8607d28486e153b6e899334fc7d2bb03d354e5ac9e813c WatchSource:0}: Error finding container 5f914875287f63533a8607d28486e153b6e899334fc7d2bb03d354e5ac9e813c: Status 404 returned error can't find the container with id 5f914875287f63533a8607d28486e153b6e899334fc7d2bb03d354e5ac9e813c Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.897495 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" event={"ID":"1b53e5d9-f7be-4704-af8e-157d5ca11102","Type":"ContainerStarted","Data":"5f914875287f63533a8607d28486e153b6e899334fc7d2bb03d354e5ac9e813c"} Mar 11 09:17:03 crc kubenswrapper[4830]: I0311 09:17:03.931971 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:03 crc kubenswrapper[4830]: E0311 09:17:03.932868 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:04 crc kubenswrapper[4830]: I0311 09:17:04.055758 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 11 09:17:04 crc kubenswrapper[4830]: I0311 09:17:04.067615 4830 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 11 09:17:04 crc kubenswrapper[4830]: I0311 09:17:04.902130 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" event={"ID":"1b53e5d9-f7be-4704-af8e-157d5ca11102","Type":"ContainerStarted","Data":"9f5a615bc69eb19b020a658bf0c894b13461e3c8259c5e8022706187aefc3e3a"} Mar 11 09:17:04 crc kubenswrapper[4830]: I0311 09:17:04.921246 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rrrvf" podStartSLOduration=122.921219462 podStartE2EDuration="2m2.921219462s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:04.921073327 +0000 UTC m=+192.702224086" watchObservedRunningTime="2026-03-11 09:17:04.921219462 +0000 UTC m=+192.702370231" Mar 11 09:17:04 crc kubenswrapper[4830]: I0311 09:17:04.932524 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:04 crc kubenswrapper[4830]: I0311 09:17:04.932744 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:04 crc kubenswrapper[4830]: E0311 09:17:04.932961 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:04 crc kubenswrapper[4830]: E0311 09:17:04.933209 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:04 crc kubenswrapper[4830]: I0311 09:17:04.933523 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:04 crc kubenswrapper[4830]: E0311 09:17:04.933667 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:05 crc kubenswrapper[4830]: I0311 09:17:05.931787 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:05 crc kubenswrapper[4830]: E0311 09:17:05.931941 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:06 crc kubenswrapper[4830]: I0311 09:17:06.932152 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:06 crc kubenswrapper[4830]: I0311 09:17:06.932189 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:06 crc kubenswrapper[4830]: I0311 09:17:06.932261 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:06 crc kubenswrapper[4830]: E0311 09:17:06.932365 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:06 crc kubenswrapper[4830]: E0311 09:17:06.932537 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:06 crc kubenswrapper[4830]: E0311 09:17:06.932771 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:07 crc kubenswrapper[4830]: I0311 09:17:07.932051 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:17:07 crc kubenswrapper[4830]: E0311 09:17:07.932203 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gtl5j_openshift-ovn-kubernetes(13b9ac6c-3f4b-4dd4-b91d-7173880939d8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" Mar 11 09:17:07 crc kubenswrapper[4830]: I0311 09:17:07.932343 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:07 crc kubenswrapper[4830]: E0311 09:17:07.932390 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:08 crc kubenswrapper[4830]: E0311 09:17:08.054370 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:17:08 crc kubenswrapper[4830]: I0311 09:17:08.931859 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:08 crc kubenswrapper[4830]: I0311 09:17:08.931920 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:08 crc kubenswrapper[4830]: I0311 09:17:08.931944 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:08 crc kubenswrapper[4830]: E0311 09:17:08.932058 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:08 crc kubenswrapper[4830]: E0311 09:17:08.932166 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:08 crc kubenswrapper[4830]: E0311 09:17:08.932324 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:09 crc kubenswrapper[4830]: I0311 09:17:09.932352 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:09 crc kubenswrapper[4830]: E0311 09:17:09.932577 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:10 crc kubenswrapper[4830]: I0311 09:17:10.931437 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:10 crc kubenswrapper[4830]: I0311 09:17:10.931553 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:10 crc kubenswrapper[4830]: I0311 09:17:10.931582 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:10 crc kubenswrapper[4830]: E0311 09:17:10.931795 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:10 crc kubenswrapper[4830]: E0311 09:17:10.931972 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:10 crc kubenswrapper[4830]: E0311 09:17:10.932110 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:11 crc kubenswrapper[4830]: I0311 09:17:11.932323 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:11 crc kubenswrapper[4830]: E0311 09:17:11.932508 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:12 crc kubenswrapper[4830]: I0311 09:17:12.931423 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:12 crc kubenswrapper[4830]: I0311 09:17:12.931478 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:12 crc kubenswrapper[4830]: E0311 09:17:12.933686 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:12 crc kubenswrapper[4830]: I0311 09:17:12.933756 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:12 crc kubenswrapper[4830]: E0311 09:17:12.933995 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:12 crc kubenswrapper[4830]: E0311 09:17:12.934079 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:13 crc kubenswrapper[4830]: E0311 09:17:13.055002 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:17:13 crc kubenswrapper[4830]: I0311 09:17:13.932348 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:13 crc kubenswrapper[4830]: E0311 09:17:13.932511 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:14 crc kubenswrapper[4830]: I0311 09:17:14.932217 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:14 crc kubenswrapper[4830]: I0311 09:17:14.932291 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:14 crc kubenswrapper[4830]: E0311 09:17:14.932456 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:14 crc kubenswrapper[4830]: E0311 09:17:14.932584 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:14 crc kubenswrapper[4830]: I0311 09:17:14.932839 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:14 crc kubenswrapper[4830]: E0311 09:17:14.933194 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:15 crc kubenswrapper[4830]: I0311 09:17:15.931572 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:15 crc kubenswrapper[4830]: E0311 09:17:15.932171 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:16 crc kubenswrapper[4830]: I0311 09:17:16.932373 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:16 crc kubenswrapper[4830]: I0311 09:17:16.932395 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:16 crc kubenswrapper[4830]: E0311 09:17:16.933352 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:16 crc kubenswrapper[4830]: E0311 09:17:16.933456 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:16 crc kubenswrapper[4830]: I0311 09:17:16.932465 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:16 crc kubenswrapper[4830]: E0311 09:17:16.933578 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:16 crc kubenswrapper[4830]: I0311 09:17:16.946737 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/1.log" Mar 11 09:17:16 crc kubenswrapper[4830]: I0311 09:17:16.947649 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/0.log" Mar 11 09:17:16 crc kubenswrapper[4830]: I0311 09:17:16.947723 4830 generic.go:334] "Generic (PLEG): container finished" podID="75fdb109-77cf-4d97-ac3c-6f3139b3bb7a" containerID="f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223" exitCode=1 Mar 11 09:17:16 crc kubenswrapper[4830]: I0311 09:17:16.947761 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8w98l" event={"ID":"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a","Type":"ContainerDied","Data":"f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223"} Mar 11 09:17:16 crc kubenswrapper[4830]: I0311 09:17:16.947803 4830 scope.go:117] "RemoveContainer" containerID="4e66eb3dca99f8cdd4970fe4608699fd7b7911d52b8462b8d0b24c32660fe85b" Mar 11 09:17:16 crc kubenswrapper[4830]: 
I0311 09:17:16.948353 4830 scope.go:117] "RemoveContainer" containerID="f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223" Mar 11 09:17:16 crc kubenswrapper[4830]: E0311 09:17:16.948605 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8w98l_openshift-multus(75fdb109-77cf-4d97-ac3c-6f3139b3bb7a)\"" pod="openshift-multus/multus-8w98l" podUID="75fdb109-77cf-4d97-ac3c-6f3139b3bb7a" Mar 11 09:17:17 crc kubenswrapper[4830]: I0311 09:17:17.932291 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:17 crc kubenswrapper[4830]: E0311 09:17:17.932893 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:17 crc kubenswrapper[4830]: I0311 09:17:17.953776 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/1.log" Mar 11 09:17:18 crc kubenswrapper[4830]: E0311 09:17:18.056580 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:17:18 crc kubenswrapper[4830]: I0311 09:17:18.932001 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:18 crc kubenswrapper[4830]: I0311 09:17:18.932133 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:18 crc kubenswrapper[4830]: E0311 09:17:18.932245 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:18 crc kubenswrapper[4830]: I0311 09:17:18.932043 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:18 crc kubenswrapper[4830]: E0311 09:17:18.932383 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:18 crc kubenswrapper[4830]: E0311 09:17:18.932546 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:19 crc kubenswrapper[4830]: I0311 09:17:19.931511 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:19 crc kubenswrapper[4830]: E0311 09:17:19.931867 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:20 crc kubenswrapper[4830]: I0311 09:17:20.931873 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:20 crc kubenswrapper[4830]: I0311 09:17:20.931895 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:20 crc kubenswrapper[4830]: I0311 09:17:20.932069 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:20 crc kubenswrapper[4830]: E0311 09:17:20.932214 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:20 crc kubenswrapper[4830]: E0311 09:17:20.932653 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:20 crc kubenswrapper[4830]: E0311 09:17:20.932774 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:20 crc kubenswrapper[4830]: I0311 09:17:20.933088 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:17:21 crc kubenswrapper[4830]: I0311 09:17:21.847486 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zl7s2"] Mar 11 09:17:21 crc kubenswrapper[4830]: I0311 09:17:21.847604 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:21 crc kubenswrapper[4830]: E0311 09:17:21.847710 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:21 crc kubenswrapper[4830]: I0311 09:17:21.968284 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/3.log" Mar 11 09:17:21 crc kubenswrapper[4830]: I0311 09:17:21.970614 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerStarted","Data":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} Mar 11 09:17:21 crc kubenswrapper[4830]: I0311 09:17:21.971040 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:17:22 crc kubenswrapper[4830]: I0311 09:17:22.005706 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podStartSLOduration=140.005679986 podStartE2EDuration="2m20.005679986s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:22.004397159 +0000 UTC m=+209.785547878" watchObservedRunningTime="2026-03-11 09:17:22.005679986 +0000 UTC m=+209.786830715" Mar 11 09:17:22 crc kubenswrapper[4830]: I0311 09:17:22.932243 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:22 crc kubenswrapper[4830]: I0311 09:17:22.932303 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:22 crc kubenswrapper[4830]: I0311 09:17:22.932374 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:22 crc kubenswrapper[4830]: I0311 09:17:22.932423 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:22 crc kubenswrapper[4830]: E0311 09:17:22.933946 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:22 crc kubenswrapper[4830]: E0311 09:17:22.934118 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:22 crc kubenswrapper[4830]: E0311 09:17:22.934209 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:22 crc kubenswrapper[4830]: E0311 09:17:22.934315 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:23 crc kubenswrapper[4830]: E0311 09:17:23.067671 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 11 09:17:24 crc kubenswrapper[4830]: I0311 09:17:24.932070 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:24 crc kubenswrapper[4830]: E0311 09:17:24.932196 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:24 crc kubenswrapper[4830]: I0311 09:17:24.932283 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:24 crc kubenswrapper[4830]: I0311 09:17:24.932311 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:24 crc kubenswrapper[4830]: I0311 09:17:24.932357 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:24 crc kubenswrapper[4830]: E0311 09:17:24.932432 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:24 crc kubenswrapper[4830]: E0311 09:17:24.932373 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:24 crc kubenswrapper[4830]: E0311 09:17:24.932590 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:26 crc kubenswrapper[4830]: I0311 09:17:26.933216 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:26 crc kubenswrapper[4830]: I0311 09:17:26.933354 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:26 crc kubenswrapper[4830]: E0311 09:17:26.933438 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:26 crc kubenswrapper[4830]: E0311 09:17:26.933559 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:26 crc kubenswrapper[4830]: I0311 09:17:26.933643 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:26 crc kubenswrapper[4830]: E0311 09:17:26.933769 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:26 crc kubenswrapper[4830]: I0311 09:17:26.933906 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:26 crc kubenswrapper[4830]: E0311 09:17:26.933999 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.069665 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.928670 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.928792 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.928893 4830 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.928917 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:19:30.928856246 +0000 UTC m=+338.710006975 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.928969 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:30.928955449 +0000 UTC m=+338.710106178 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.929219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929331 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929346 4830 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.929347 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929356 4830 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.929436 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929488 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:30.929474564 +0000 UTC m=+338.710625253 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929381 4830 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929695 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:30.9296806 +0000 UTC m=+338.710831329 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929734 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929764 4830 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929785 4830 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.929856 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:30.929834444 +0000 UTC m=+338.710985173 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.931814 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.931818 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.932001 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.932085 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.931956 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.932184 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:28 crc kubenswrapper[4830]: I0311 09:17:28.932273 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:28 crc kubenswrapper[4830]: E0311 09:17:28.932476 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:30 crc kubenswrapper[4830]: I0311 09:17:30.932311 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:30 crc kubenswrapper[4830]: E0311 09:17:30.932457 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:30 crc kubenswrapper[4830]: I0311 09:17:30.932565 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:30 crc kubenswrapper[4830]: E0311 09:17:30.932753 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:30 crc kubenswrapper[4830]: I0311 09:17:30.932851 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:30 crc kubenswrapper[4830]: E0311 09:17:30.932935 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:30 crc kubenswrapper[4830]: I0311 09:17:30.932999 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:30 crc kubenswrapper[4830]: E0311 09:17:30.933098 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:31 crc kubenswrapper[4830]: I0311 09:17:31.933150 4830 scope.go:117] "RemoveContainer" containerID="f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223" Mar 11 09:17:32 crc kubenswrapper[4830]: I0311 09:17:32.932253 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:32 crc kubenswrapper[4830]: E0311 09:17:32.934453 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:32 crc kubenswrapper[4830]: I0311 09:17:32.934495 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:32 crc kubenswrapper[4830]: I0311 09:17:32.934552 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:32 crc kubenswrapper[4830]: E0311 09:17:32.934668 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:32 crc kubenswrapper[4830]: I0311 09:17:32.934530 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:32 crc kubenswrapper[4830]: E0311 09:17:32.934828 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:32 crc kubenswrapper[4830]: E0311 09:17:32.934923 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:33 crc kubenswrapper[4830]: I0311 09:17:33.010306 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/1.log" Mar 11 09:17:33 crc kubenswrapper[4830]: I0311 09:17:33.010385 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8w98l" event={"ID":"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a","Type":"ContainerStarted","Data":"81a0883de2371cc2d74c6c6fe6667f4a6bc5042033ad0fe8911ac16bd14906ac"} Mar 11 09:17:33 crc kubenswrapper[4830]: E0311 09:17:33.070664 4830 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 11 09:17:34 crc kubenswrapper[4830]: I0311 09:17:34.931978 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:34 crc kubenswrapper[4830]: I0311 09:17:34.932057 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:34 crc kubenswrapper[4830]: I0311 09:17:34.932114 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:34 crc kubenswrapper[4830]: E0311 09:17:34.932220 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:34 crc kubenswrapper[4830]: I0311 09:17:34.932295 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:34 crc kubenswrapper[4830]: E0311 09:17:34.932371 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:34 crc kubenswrapper[4830]: E0311 09:17:34.932557 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:34 crc kubenswrapper[4830]: E0311 09:17:34.932699 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:36 crc kubenswrapper[4830]: I0311 09:17:36.931455 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:36 crc kubenswrapper[4830]: I0311 09:17:36.931574 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:36 crc kubenswrapper[4830]: E0311 09:17:36.931627 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 11 09:17:36 crc kubenswrapper[4830]: I0311 09:17:36.931661 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:36 crc kubenswrapper[4830]: I0311 09:17:36.931699 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:36 crc kubenswrapper[4830]: E0311 09:17:36.931891 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zl7s2" podUID="e06af9c6-9acb-4a23-bc91-01fd25fa4915" Mar 11 09:17:36 crc kubenswrapper[4830]: E0311 09:17:36.932065 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 11 09:17:36 crc kubenswrapper[4830]: E0311 09:17:36.932202 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.931504 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.932451 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.932518 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.932720 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.935607 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.935642 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.936183 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.936820 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.937181 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 09:17:38 crc kubenswrapper[4830]: I0311 09:17:38.944617 4830 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 09:17:43 crc kubenswrapper[4830]: I0311 09:17:43.060314 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:17:43 crc kubenswrapper[4830]: I0311 09:17:43.060398 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:17:43 crc kubenswrapper[4830]: I0311 09:17:43.480239 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.346871 4830 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.452515 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-r4nnq"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.453299 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.456733 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.457143 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.457541 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.459445 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.460013 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.460470 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.461647 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.463741 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.464652 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.464886 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.465385 4830 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6dtx"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.465926 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.466564 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.469767 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.470679 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.471788 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.472305 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.472359 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.472587 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.477534 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.478747 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.479366 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.479429 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.479668 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.481879 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.481883 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.481917 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 09:17:44 crc kubenswrapper[4830]: 
I0311 09:17:44.481955 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.481984 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.481981 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.482046 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.482070 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.482240 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.482431 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.484041 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.484108 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.485557 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.492878 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.493089 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.494854 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s8cnh"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.495325 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.495655 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.504933 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.505278 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.505909 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.506186 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.507680 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xqggt"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.508501 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.508705 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-r4nnq"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.509011 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.509976 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k86x2"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.523627 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.523837 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.524055 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.524132 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.524175 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.524297 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.524399 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.524503 
4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.524610 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jf97s"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.524733 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.526982 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.533770 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.534011 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.534161 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.534712 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.534866 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.535305 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.535736 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-cprgf"] 
Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536050 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536350 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536695 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536720 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536743 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cprgf" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536764 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536694 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536826 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.536923 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.537093 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.537290 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.537352 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-75j46"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.537394 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.537355 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.537989 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f87sw"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.538318 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.538615 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-426g8"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.542301 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.542664 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.543230 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.543311 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.551408 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.553587 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.553829 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.554232 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.555394 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jr5k9"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.555730 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.555977 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.556202 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.556432 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.557385 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.557407 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hnjgz"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.558535 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.558611 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.558798 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.559285 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.560658 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.560940 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.561403 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.561553 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.561828 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562102 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.563221 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562200 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562210 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562241 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562277 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562295 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 11 09:17:44 crc 
kubenswrapper[4830]: I0311 09:17:44.562311 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562550 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562558 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.580000 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562632 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562708 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.562956 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.563065 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.585808 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.587440 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-pplbq"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.588525 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.591397 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.591638 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.592081 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.592903 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.594145 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.594268 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.594319 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.594728 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.594789 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"console-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.595481 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.595761 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.595890 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.596032 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.596181 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.596492 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.596996 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.612260 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.612327 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.614266 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 11 09:17:44 crc 
kubenswrapper[4830]: I0311 09:17:44.616150 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.616274 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.616452 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.616963 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.617415 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.617654 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.617768 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.624072 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qn6h2"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.624798 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.625281 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6dtx"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.626439 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dpw5"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.626984 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.627474 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.627893 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.629759 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.630112 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.630530 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.630640 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.636895 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-j29w2"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.637418 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sbhg5"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.637565 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.638102 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.638262 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.638457 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.638688 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.641080 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.641765 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.641940 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.642392 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.643397 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-k476p"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.643740 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644079 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-etcd-client\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644107 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw84n\" (UniqueName: 
\"kubernetes.io/projected/3569f4d9-01f3-47be-bc27-af6a7cb999a1-kube-api-access-pw84n\") pod \"downloads-7954f5f757-cprgf\" (UID: \"3569f4d9-01f3-47be-bc27-af6a7cb999a1\") " pod="openshift-console/downloads-7954f5f757-cprgf" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644086 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644083 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k476p" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644154 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644242 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644263 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb56b\" (UniqueName: \"kubernetes.io/projected/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-kube-api-access-xb56b\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644282 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pc42\" (UniqueName: \"kubernetes.io/projected/61f0fae5-d6d9-4f7b-b163-72316a316a37-kube-api-access-4pc42\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644301 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8377f033-b2c4-426e-bf41-3de032a37373-serving-cert\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644320 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a7f311-e393-45f7-9b05-28590202d310-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644337 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644358 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644376 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88631f5e-5bc9-450f-a46f-b81b6514afdd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644406 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644423 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8377f033-b2c4-426e-bf41-3de032a37373-etcd-client\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644443 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-policies\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644459 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd48e28d-bcdd-4bba-a540-0213cda9599a-config\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644476 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-serving-cert\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644494 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-service-ca\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644509 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0584184a-1f77-477d-8988-df9f60c5b194-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644546 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-auth-proxy-config\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644574 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a7f311-e393-45f7-9b05-28590202d310-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644596 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-audit-dir\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644613 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553676-chghx"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644616 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-config\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644727 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-etcd-service-ca\") pod \"etcd-operator-b45778765-jf97s\" 
(UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644746 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88631f5e-5bc9-450f-a46f-b81b6514afdd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644763 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-audit-policies\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644793 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a7f311-e393-45f7-9b05-28590202d310-config\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644840 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-dir\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644861 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-etcd-serving-ca\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644882 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0a49527e-c963-4f2c-8e8c-5f2a879ac281-node-pullsecrets\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644901 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644920 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61f0fae5-d6d9-4f7b-b163-72316a316a37-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644940 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644958 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.644963 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553676-chghx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645026 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645073 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg4bn\" (UniqueName: \"kubernetes.io/projected/ea88a306-e701-4a49-b4d2-7c4b62372c06-kube-api-access-bg4bn\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645097 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg8w8\" (UniqueName: 
\"kubernetes.io/projected/fd48e28d-bcdd-4bba-a540-0213cda9599a-kube-api-access-dg8w8\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645116 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0584184a-1f77-477d-8988-df9f60c5b194-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645136 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645155 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-audit\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645174 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-etcd-client\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 
09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645193 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-encryption-config\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645209 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5b7c\" (UniqueName: \"kubernetes.io/projected/8377f033-b2c4-426e-bf41-3de032a37373-kube-api-access-d5b7c\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645230 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-trusted-ca-bundle\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645257 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645283 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nk42\" (UniqueName: 
\"kubernetes.io/projected/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-kube-api-access-8nk42\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645318 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-config\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645337 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645354 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-config\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645370 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kc9g\" (UniqueName: \"kubernetes.io/projected/cd450036-5201-4553-a9de-c08a7a9c9f52-kube-api-access-4kc9g\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " 
pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645388 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-serving-cert\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645417 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qdp\" (UniqueName: \"kubernetes.io/projected/88631f5e-5bc9-450f-a46f-b81b6514afdd-kube-api-access-l8qdp\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645443 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-client-ca\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645477 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645493 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-serving-cert\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645507 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f0fae5-d6d9-4f7b-b163-72316a316a37-serving-cert\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645526 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-machine-approver-tls\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-serving-cert\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645594 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2f08a-36b5-487c-8222-f303e055755f-metrics-tls\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645611 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-etcd-ca\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645629 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lq6xt\" (UniqueName: \"kubernetes.io/projected/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-kube-api-access-lq6xt\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645648 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjprf\" (UniqueName: \"kubernetes.io/projected/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-kube-api-access-qjprf\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645664 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-service-ca-bundle\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645686 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-oauth-serving-cert\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645704 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58pcq\" (UniqueName: \"kubernetes.io/projected/87e5c483-a4bb-46b6-add5-1d1638cbbbab-kube-api-access-58pcq\") pod \"cluster-samples-operator-665b6dd947-hhjh9\" (UID: 
\"87e5c483-a4bb-46b6-add5-1d1638cbbbab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645725 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-serving-cert\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645743 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd48e28d-bcdd-4bba-a540-0213cda9599a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645761 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-console-config\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645776 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0584184a-1f77-477d-8988-df9f60c5b194-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645793 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645811 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645828 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645845 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/87e5c483-a4bb-46b6-add5-1d1638cbbbab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hhjh9\" (UID: \"87e5c483-a4bb-46b6-add5-1d1638cbbbab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645794 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk"] Mar 11 09:17:44 crc kubenswrapper[4830]: 
I0311 09:17:44.645893 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwjrq\" (UniqueName: \"kubernetes.io/projected/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-kube-api-access-cwjrq\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645930 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsqw8\" (UniqueName: \"kubernetes.io/projected/81a2f08a-36b5-487c-8222-f303e055755f-kube-api-access-dsqw8\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.645962 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-config\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646006 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a49527e-c963-4f2c-8e8c-5f2a879ac281-audit-dir\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646040 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh5sc\" (UniqueName: 
\"kubernetes.io/projected/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-kube-api-access-gh5sc\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646056 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646100 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-encryption-config\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646121 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8sf8\" (UniqueName: \"kubernetes.io/projected/0a49527e-c963-4f2c-8e8c-5f2a879ac281-kube-api-access-g8sf8\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646136 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fd48e28d-bcdd-4bba-a540-0213cda9599a-images\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 
09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646157 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a2f08a-36b5-487c-8222-f303e055755f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646184 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81a2f08a-36b5-487c-8222-f303e055755f-trusted-ca\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646223 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-image-import-ca\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646242 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-oauth-config\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646256 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-config\") pod \"machine-approver-56656f9798-wdpdj\" (UID: 
\"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646275 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646292 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646309 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s8k7\" (UniqueName: \"kubernetes.io/projected/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-kube-api-access-4s8k7\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.646538 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.647114 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.647895 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nfnzk"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.648375 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.648565 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.649129 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.649398 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xqggt"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.650310 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s8cnh"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.651242 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.652171 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cprgf"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.653949 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.656794 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.658238 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f87sw"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.658361 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.660317 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jf97s"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.661465 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dpw5"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.662975 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-75j46"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.665387 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jr5k9"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.671702 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.675171 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qn6h2"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.675631 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.677556 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hnjgz"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.678940 4830 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.679929 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.680974 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-426g8"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.682088 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-lhp5r"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.682905 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.683189 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.684324 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.685335 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553676-chghx"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.686367 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.687489 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.688512 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-k476p"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.689578 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rj9w8"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.690807 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.692074 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.694735 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pplbq"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.695161 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.696437 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nfnzk"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.697942 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.699633 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.701206 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rj9w8"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.702662 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.704237 4830 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.705694 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.707404 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k86x2"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.708991 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.710453 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sbhg5"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.712085 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk"] Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.723340 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.735686 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.746929 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-trusted-ca-bundle\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.746962 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.746982 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nk42\" (UniqueName: \"kubernetes.io/projected/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-kube-api-access-8nk42\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747004 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747061 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-config\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747078 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-config\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 
11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747094 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc9g\" (UniqueName: \"kubernetes.io/projected/cd450036-5201-4553-a9de-c08a7a9c9f52-kube-api-access-4kc9g\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747111 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-serving-cert\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747131 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cwk2\" (UniqueName: \"kubernetes.io/projected/4704aedc-31a3-4890-a1f9-1fc6533caae0-kube-api-access-2cwk2\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747152 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8qdp\" (UniqueName: \"kubernetes.io/projected/88631f5e-5bc9-450f-a46f-b81b6514afdd-kube-api-access-l8qdp\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747171 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747187 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747204 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-serving-cert\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747221 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f0fae5-d6d9-4f7b-b163-72316a316a37-serving-cert\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747241 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-client-ca\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747260 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747279 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-trusted-ca-bundle\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747303 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-machine-approver-tls\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747320 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-etcd-ca\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747338 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6xt\" (UniqueName: \"kubernetes.io/projected/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-kube-api-access-lq6xt\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747354 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pplbq\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747380 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-serving-cert\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747404 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2f08a-36b5-487c-8222-f303e055755f-metrics-tls\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747421 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjprf\" (UniqueName: \"kubernetes.io/projected/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-kube-api-access-qjprf\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747444 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-service-ca-bundle\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-oauth-serving-cert\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747484 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58pcq\" (UniqueName: \"kubernetes.io/projected/87e5c483-a4bb-46b6-add5-1d1638cbbbab-kube-api-access-58pcq\") pod \"cluster-samples-operator-665b6dd947-hhjh9\" (UID: \"87e5c483-a4bb-46b6-add5-1d1638cbbbab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747500 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-console-config\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747517 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0584184a-1f77-477d-8988-df9f60c5b194-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 
09:17:44.747533 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-serving-cert\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747550 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd48e28d-bcdd-4bba-a540-0213cda9599a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747571 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747587 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747604 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/87e5c483-a4bb-46b6-add5-1d1638cbbbab-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-hhjh9\" (UID: \"87e5c483-a4bb-46b6-add5-1d1638cbbbab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747620 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwjrq\" (UniqueName: \"kubernetes.io/projected/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-kube-api-access-cwjrq\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747635 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsqw8\" (UniqueName: \"kubernetes.io/projected/81a2f08a-36b5-487c-8222-f303e055755f-kube-api-access-dsqw8\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747651 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747667 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-config\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747687 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx2vk\" (UniqueName: \"kubernetes.io/projected/c61d535e-afb5-4006-a758-8bba8735a860-kube-api-access-fx2vk\") pod \"marketplace-operator-79b997595-pplbq\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747716 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a49527e-c963-4f2c-8e8c-5f2a879ac281-audit-dir\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747733 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh5sc\" (UniqueName: \"kubernetes.io/projected/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-kube-api-access-gh5sc\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747748 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747757 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-config\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747769 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-encryption-config\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747822 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8sf8\" (UniqueName: \"kubernetes.io/projected/0a49527e-c963-4f2c-8e8c-5f2a879ac281-kube-api-access-g8sf8\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747850 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fd48e28d-bcdd-4bba-a540-0213cda9599a-images\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747886 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81a2f08a-36b5-487c-8222-f303e055755f-trusted-ca\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747905 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a2f08a-36b5-487c-8222-f303e055755f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: 
\"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747931 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-image-import-ca\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747954 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-oauth-config\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.747970 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-config\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748032 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s8k7\" (UniqueName: \"kubernetes.io/projected/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-kube-api-access-4s8k7\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748053 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748068 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748087 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-etcd-client\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748104 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748127 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4704aedc-31a3-4890-a1f9-1fc6533caae0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 
09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748149 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw84n\" (UniqueName: \"kubernetes.io/projected/3569f4d9-01f3-47be-bc27-af6a7cb999a1-kube-api-access-pw84n\") pod \"downloads-7954f5f757-cprgf\" (UID: \"3569f4d9-01f3-47be-bc27-af6a7cb999a1\") " pod="openshift-console/downloads-7954f5f757-cprgf" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748164 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748165 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-trusted-ca-bundle\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748180 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748199 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb56b\" (UniqueName: \"kubernetes.io/projected/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-kube-api-access-xb56b\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pc42\" (UniqueName: \"kubernetes.io/projected/61f0fae5-d6d9-4f7b-b163-72316a316a37-kube-api-access-4pc42\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748253 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8377f033-b2c4-426e-bf41-3de032a37373-serving-cert\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748275 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a7f311-e393-45f7-9b05-28590202d310-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748295 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88631f5e-5bc9-450f-a46f-b81b6514afdd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748326 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748343 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748360 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748376 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8377f033-b2c4-426e-bf41-3de032a37373-etcd-client\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748405 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-policies\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 
09:17:44.748423 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd48e28d-bcdd-4bba-a540-0213cda9599a-config\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748442 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-serving-cert\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748460 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-service-ca\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748479 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0584184a-1f77-477d-8988-df9f60c5b194-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-auth-proxy-config\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748511 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a7f311-e393-45f7-9b05-28590202d310-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748532 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-config\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748551 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-etcd-service-ca\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748569 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4704aedc-31a3-4890-a1f9-1fc6533caae0-images\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748586 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-audit-dir\") pod 
\"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748602 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a7f311-e393-45f7-9b05-28590202d310-config\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748619 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88631f5e-5bc9-450f-a46f-b81b6514afdd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748634 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-audit-policies\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748651 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pplbq\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748670 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4704aedc-31a3-4890-a1f9-1fc6533caae0-proxy-tls\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748689 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-dir\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-config\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748705 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-etcd-serving-ca\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748754 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0a49527e-c963-4f2c-8e8c-5f2a879ac281-node-pullsecrets\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 
09:17:44.748783 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748802 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748809 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61f0fae5-d6d9-4f7b-b163-72316a316a37-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748873 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748899 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748932 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748958 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg4bn\" (UniqueName: \"kubernetes.io/projected/ea88a306-e701-4a49-b4d2-7c4b62372c06-kube-api-access-bg4bn\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.748991 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fd48e28d-bcdd-4bba-a540-0213cda9599a-images\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749030 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg8w8\" (UniqueName: \"kubernetes.io/projected/fd48e28d-bcdd-4bba-a540-0213cda9599a-kube-api-access-dg8w8\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749057 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0584184a-1f77-477d-8988-df9f60c5b194-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-etcd-client\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749109 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-encryption-config\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749133 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5b7c\" (UniqueName: \"kubernetes.io/projected/8377f033-b2c4-426e-bf41-3de032a37373-kube-api-access-d5b7c\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749158 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc 
kubenswrapper[4830]: I0311 09:17:44.749182 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-audit\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749264 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-etcd-serving-ca\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749776 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-etcd-ca\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749798 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-audit\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.749978 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/61f0fae5-d6d9-4f7b-b163-72316a316a37-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 
09:17:44.750114 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0a49527e-c963-4f2c-8e8c-5f2a879ac281-node-pullsecrets\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.750269 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81a2f08a-36b5-487c-8222-f303e055755f-trusted-ca\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.752086 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a49527e-c963-4f2c-8e8c-5f2a879ac281-audit-dir\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.752270 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-client-ca\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.752335 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-config\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.752429 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.752714 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.752879 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.753106 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.753357 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-console-config\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: 
I0311 09:17:44.753695 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.754311 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-audit-dir\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.755044 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8377f033-b2c4-426e-bf41-3de032a37373-etcd-service-ca\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.755226 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a7f311-e393-45f7-9b05-28590202d310-config\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.755891 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-trusted-ca-bundle\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 
crc kubenswrapper[4830]: I0311 09:17:44.755977 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.758583 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.756266 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.756415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd48e28d-bcdd-4bba-a540-0213cda9599a-config\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.756553 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-policies\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: 
\"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.753337 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-image-import-ca\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.756692 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88631f5e-5bc9-450f-a46f-b81b6514afdd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.756982 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-serving-cert\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.757572 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-auth-proxy-config\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.757976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-service-ca-bundle\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.758163 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-audit-policies\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.758213 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-dir\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.759063 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.759525 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-serving-cert\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.759769 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-oauth-serving-cert\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.759972 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-config\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.760639 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-service-ca\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.760785 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.756140 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-etcd-client\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.760847 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8377f033-b2c4-426e-bf41-3de032a37373-serving-cert\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.761223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a49527e-c963-4f2c-8e8c-5f2a879ac281-config\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.761465 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0584184a-1f77-477d-8988-df9f60c5b194-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.761502 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88631f5e-5bc9-450f-a46f-b81b6514afdd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762039 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-oauth-config\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762066 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-idp-0-file-data\") 
pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd48e28d-bcdd-4bba-a540-0213cda9599a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762143 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8377f033-b2c4-426e-bf41-3de032a37373-etcd-client\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762494 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762541 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-serving-cert\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762545 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762696 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-encryption-config\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762809 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-serving-cert\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.762893 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.763356 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-etcd-client\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.765119 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.765255 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.765322 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.765333 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0a7f311-e393-45f7-9b05-28590202d310-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.765605 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: 
\"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.765622 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-machine-approver-tls\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.765809 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61f0fae5-d6d9-4f7b-b163-72316a316a37-serving-cert\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.766035 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2f08a-36b5-487c-8222-f303e055755f-metrics-tls\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.766195 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/87e5c483-a4bb-46b6-add5-1d1638cbbbab-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-hhjh9\" (UID: \"87e5c483-a4bb-46b6-add5-1d1638cbbbab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.766274 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/0a49527e-c963-4f2c-8e8c-5f2a879ac281-encryption-config\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.766738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-serving-cert\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.767759 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.776351 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.779079 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0584184a-1f77-477d-8988-df9f60c5b194-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.796541 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.815725 4830 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.822415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.835545 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.850011 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4704aedc-31a3-4890-a1f9-1fc6533caae0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.850142 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4704aedc-31a3-4890-a1f9-1fc6533caae0-images\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.850173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pplbq\" (UID: 
\"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.850202 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4704aedc-31a3-4890-a1f9-1fc6533caae0-proxy-tls\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.850318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cwk2\" (UniqueName: \"kubernetes.io/projected/4704aedc-31a3-4890-a1f9-1fc6533caae0-kube-api-access-2cwk2\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.851247 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4704aedc-31a3-4890-a1f9-1fc6533caae0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.851253 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pplbq\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.851379 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fx2vk\" (UniqueName: \"kubernetes.io/projected/c61d535e-afb5-4006-a758-8bba8735a860-kube-api-access-fx2vk\") pod \"marketplace-operator-79b997595-pplbq\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.861163 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.876055 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.878288 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.895347 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.916667 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.935557 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.955895 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.976202 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 11 09:17:44 crc kubenswrapper[4830]: I0311 09:17:44.995769 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.016181 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.036231 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.066289 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.076072 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.095924 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.116706 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.135603 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.155761 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.175480 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.184224 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pplbq\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.209329 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.213768 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pplbq\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.215611 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.235617 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.255747 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.275331 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.296307 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 11 09:17:45 crc 
kubenswrapper[4830]: I0311 09:17:45.340613 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.356431 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.376680 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.396703 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.415914 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.435763 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.455937 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.476837 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.496635 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.515387 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.535653 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 11 
09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.556274 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.576934 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.596571 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.616350 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.634886 4830 request.go:700] Waited for 1.006741724s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-controller-manager-operator-config&limit=500&resourceVersion=0 Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.637106 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.656472 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.662010 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4704aedc-31a3-4890-a1f9-1fc6533caae0-images\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.676427 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.685438 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4704aedc-31a3-4890-a1f9-1fc6533caae0-proxy-tls\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.695865 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.716124 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.735590 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.756543 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.776359 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.796259 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.817055 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 11 
09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.836692 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.856992 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.875886 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.896608 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.915954 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.937530 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.956596 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.975937 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 11 09:17:45 crc kubenswrapper[4830]: I0311 09:17:45.996672 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.014911 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.036568 4830 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.056295 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.076489 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.096876 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.116891 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.136192 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.156591 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.176100 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.195481 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.216350 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.235415 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 
09:17:46.255989 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.274789 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.296418 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.315379 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.335876 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.355862 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.376149 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.396186 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.415913 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.456077 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.477916 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.496425 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.516636 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.536329 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.556921 4830 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.600091 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nk42\" (UniqueName: \"kubernetes.io/projected/83f0f2b7-77f8-4ce3-bc3b-d24879087a86-kube-api-access-8nk42\") pod \"openshift-controller-manager-operator-756b6f6bc6-wzwg2\" (UID: \"83f0f2b7-77f8-4ce3-bc3b-d24879087a86\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.624659 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kc9g\" (UniqueName: \"kubernetes.io/projected/cd450036-5201-4553-a9de-c08a7a9c9f52-kube-api-access-4kc9g\") pod \"console-f9d7485db-75j46\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.648208 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qdp\" (UniqueName: \"kubernetes.io/projected/88631f5e-5bc9-450f-a46f-b81b6514afdd-kube-api-access-l8qdp\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-s6zpx\" (UID: \"88631f5e-5bc9-450f-a46f-b81b6514afdd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.653895 4830 request.go:700] Waited for 1.904730765s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.666209 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58pcq\" (UniqueName: \"kubernetes.io/projected/87e5c483-a4bb-46b6-add5-1d1638cbbbab-kube-api-access-58pcq\") pod \"cluster-samples-operator-665b6dd947-hhjh9\" (UID: \"87e5c483-a4bb-46b6-add5-1d1638cbbbab\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.682487 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8sf8\" (UniqueName: \"kubernetes.io/projected/0a49527e-c963-4f2c-8e8c-5f2a879ac281-kube-api-access-g8sf8\") pod \"apiserver-76f77b778f-r4nnq\" (UID: \"0a49527e-c963-4f2c-8e8c-5f2a879ac281\") " pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.705972 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a2f08a-36b5-487c-8222-f303e055755f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.722819 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6xt\" (UniqueName: 
\"kubernetes.io/projected/183cb9e1-fa20-48f1-8cb4-7bce808e72ed-kube-api-access-lq6xt\") pod \"authentication-operator-69f744f599-f87sw\" (UID: \"183cb9e1-fa20-48f1-8cb4-7bce808e72ed\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.746184 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw84n\" (UniqueName: \"kubernetes.io/projected/3569f4d9-01f3-47be-bc27-af6a7cb999a1-kube-api-access-pw84n\") pod \"downloads-7954f5f757-cprgf\" (UID: \"3569f4d9-01f3-47be-bc27-af6a7cb999a1\") " pod="openshift-console/downloads-7954f5f757-cprgf" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.749404 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cprgf" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.762409 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.773466 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwjrq\" (UniqueName: \"kubernetes.io/projected/c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3-kube-api-access-cwjrq\") pod \"openshift-apiserver-operator-796bbdcf4f-5tmf8\" (UID: \"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.777366 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.784914 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.791955 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.811226 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsqw8\" (UniqueName: \"kubernetes.io/projected/81a2f08a-36b5-487c-8222-f303e055755f-kube-api-access-dsqw8\") pod \"ingress-operator-5b745b69d9-426g8\" (UID: \"81a2f08a-36b5-487c-8222-f303e055755f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.814579 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb56b\" (UniqueName: \"kubernetes.io/projected/bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de-kube-api-access-xb56b\") pod \"apiserver-7bbb656c7d-ffkfl\" (UID: \"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.814838 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.823487 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.835994 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pc42\" (UniqueName: \"kubernetes.io/projected/61f0fae5-d6d9-4f7b-b163-72316a316a37-kube-api-access-4pc42\") pod \"openshift-config-operator-7777fb866f-xqggt\" (UID: \"61f0fae5-d6d9-4f7b-b163-72316a316a37\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.858990 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh5sc\" (UniqueName: \"kubernetes.io/projected/eacc4dea-3b3a-47c2-9c1e-f8392a7509bf-kube-api-access-gh5sc\") pod \"machine-approver-56656f9798-wdpdj\" (UID: \"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.875225 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s8k7\" (UniqueName: \"kubernetes.io/projected/bc1c56ec-23e5-4b04-87c3-fbfb6775c07d-kube-api-access-4s8k7\") pod \"cluster-image-registry-operator-dc59b4c8b-cznhz\" (UID: \"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.878557 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.892905 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.893943 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a7f311-e393-45f7-9b05-28590202d310-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-h7j2f\" (UID: \"d0a7f311-e393-45f7-9b05-28590202d310\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.924142 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg4bn\" (UniqueName: \"kubernetes.io/projected/ea88a306-e701-4a49-b4d2-7c4b62372c06-kube-api-access-bg4bn\") pod \"oauth-openshift-558db77b4-k86x2\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.932604 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjprf\" (UniqueName: \"kubernetes.io/projected/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-kube-api-access-qjprf\") pod \"controller-manager-879f6c89f-l6dtx\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.934340 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.943166 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.953397 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0584184a-1f77-477d-8988-df9f60c5b194-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-knlpj\" (UID: \"0584184a-1f77-477d-8988-df9f60c5b194\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.960731 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.971323 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg8w8\" (UniqueName: \"kubernetes.io/projected/fd48e28d-bcdd-4bba-a540-0213cda9599a-kube-api-access-dg8w8\") pod \"machine-api-operator-5694c8668f-s8cnh\" (UID: \"fd48e28d-bcdd-4bba-a540-0213cda9599a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:46 crc kubenswrapper[4830]: I0311 09:17:46.991315 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5b7c\" (UniqueName: \"kubernetes.io/projected/8377f033-b2c4-426e-bf41-3de032a37373-kube-api-access-d5b7c\") pod \"etcd-operator-b45778765-jf97s\" (UID: \"8377f033-b2c4-426e-bf41-3de032a37373\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.018507 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.019424 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cwk2\" (UniqueName: \"kubernetes.io/projected/4704aedc-31a3-4890-a1f9-1fc6533caae0-kube-api-access-2cwk2\") pod \"machine-config-operator-74547568cd-bdrq8\" (UID: \"4704aedc-31a3-4890-a1f9-1fc6533caae0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.031958 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.044033 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx2vk\" (UniqueName: \"kubernetes.io/projected/c61d535e-afb5-4006-a758-8bba8735a860-kube-api-access-fx2vk\") pod \"marketplace-operator-79b997595-pplbq\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.059187 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.070115 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.086597 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-metrics-tls\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.086639 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-srv-cert\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.087226 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04656f2-1bad-494e-a8d9-9671fe42431e-config\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.087533 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-certificates\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.087583 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-default-certificate\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088126 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-profile-collector-cert\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088161 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhcc\" (UniqueName: \"kubernetes.io/projected/49ec9e50-30cd-4028-bf0d-ac67afb7e344-kube-api-access-5zhcc\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088197 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plm7\" (UniqueName: \"kubernetes.io/projected/3ba20c1f-29f0-4784-8683-621c75daffb3-kube-api-access-8plm7\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088217 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-metrics-certs\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " 
pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088274 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxx4j\" (UniqueName: \"kubernetes.io/projected/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-kube-api-access-cxx4j\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088299 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199126b4-f145-46ee-9018-ab173f1267f7-config\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088334 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-tls\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088357 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088385 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84392623-69c8-4e0c-a3bf-59a57b487494-signing-cabundle\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088436 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dec76324-c87e-445a-9236-a674976e81dc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sbhg5\" (UID: \"dec76324-c87e-445a-9236-a674976e81dc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088562 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4kc5\" (UniqueName: \"kubernetes.io/projected/31e79cbc-4d50-4925-9b5b-7a5012de13a0-kube-api-access-d4kc5\") pod \"dns-operator-744455d44c-jr5k9\" (UID: \"31e79cbc-4d50-4925-9b5b-7a5012de13a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088607 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx68s\" (UniqueName: \"kubernetes.io/projected/e29d573c-753f-4d0d-8c55-3e2842703508-kube-api-access-tx68s\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: \"e29d573c-753f-4d0d-8c55-3e2842703508\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088627 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199126b4-f145-46ee-9018-ab173f1267f7-serving-cert\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.088644 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:47.588629264 +0000 UTC m=+235.369779953 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088680 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5rdt\" (UniqueName: \"kubernetes.io/projected/199126b4-f145-46ee-9018-ab173f1267f7-kube-api-access-s5rdt\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.088705 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/31e79cbc-4d50-4925-9b5b-7a5012de13a0-metrics-tls\") pod \"dns-operator-744455d44c-jr5k9\" (UID: \"31e79cbc-4d50-4925-9b5b-7a5012de13a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089252 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-trusted-ca\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089359 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-stats-auth\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089439 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98xrz\" (UniqueName: \"kubernetes.io/projected/0da359c8-ec5a-4e8b-9632-7b5aba7856d0-kube-api-access-98xrz\") pod \"migrator-59844c95c7-6hzgd\" (UID: \"0da359c8-ec5a-4e8b-9632-7b5aba7856d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089488 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efd0238e-2294-4b23-ab03-88e149c4a0c9-config-volume\") pod \"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 
09:17:47.089518 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-config\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzghj\" (UniqueName: \"kubernetes.io/projected/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-kube-api-access-xzghj\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089586 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz67z\" (UniqueName: \"kubernetes.io/projected/b04656f2-1bad-494e-a8d9-9671fe42431e-kube-api-access-pz67z\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089641 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tqscv\" (UID: \"cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089809 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sj5g\" (UniqueName: 
\"kubernetes.io/projected/efd0238e-2294-4b23-ab03-88e149c4a0c9-kube-api-access-2sj5g\") pod \"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089865 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ba20c1f-29f0-4784-8683-621c75daffb3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089893 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prrjl\" (UniqueName: \"kubernetes.io/projected/dc054b59-4478-4188-a9a5-12bb26b68c96-kube-api-access-prrjl\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.089936 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa05ab98-eec8-4db5-a84a-9d145cc84e56-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: \"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091178 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qh2\" (UniqueName: \"kubernetes.io/projected/680233cf-fda8-402e-95a6-a596a0edd470-kube-api-access-l9qh2\") pod \"auto-csr-approver-29553676-chghx\" (UID: 
\"680233cf-fda8-402e-95a6-a596a0edd470\") " pod="openshift-infra/auto-csr-approver-29553676-chghx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091217 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-tmpfs\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091244 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19e5e854-acf9-4b7e-9088-5c8a50ea0186-cert\") pod \"ingress-canary-k476p\" (UID: \"19e5e854-acf9-4b7e-9088-5c8a50ea0186\") " pod="openshift-ingress-canary/ingress-canary-k476p" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091343 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-apiservice-cert\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091375 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa05ab98-eec8-4db5-a84a-9d145cc84e56-config\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: \"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091444 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-bound-sa-token\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091468 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwnq2\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-kube-api-access-gwnq2\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091515 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmbm\" (UniqueName: \"kubernetes.io/projected/02fc0c48-db54-4b8c-8642-287a4720251f-kube-api-access-5xmbm\") pod \"package-server-manager-789f6589d5-zdtbz\" (UID: \"02fc0c48-db54-4b8c-8642-287a4720251f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091667 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-config-volume\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091702 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/199126b4-f145-46ee-9018-ab173f1267f7-trusted-ca\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc 
kubenswrapper[4830]: I0311 09:17:47.091724 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efd0238e-2294-4b23-ab03-88e149c4a0c9-secret-volume\") pod \"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091776 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-client-ca\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091797 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ba20c1f-29f0-4784-8683-621c75daffb3-proxy-tls\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091820 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-webhook-cert\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091838 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwvdb\" (UniqueName: 
\"kubernetes.io/projected/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-kube-api-access-kwvdb\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091853 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/02fc0c48-db54-4b8c-8642-287a4720251f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zdtbz\" (UID: \"02fc0c48-db54-4b8c-8642-287a4720251f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091871 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn5rf\" (UniqueName: \"kubernetes.io/projected/dec76324-c87e-445a-9236-a674976e81dc-kube-api-access-mn5rf\") pod \"multus-admission-controller-857f4d67dd-sbhg5\" (UID: \"dec76324-c87e-445a-9236-a674976e81dc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091915 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvbqr\" (UniqueName: \"kubernetes.io/projected/84392623-69c8-4e0c-a3bf-59a57b487494-kube-api-access-zvbqr\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091934 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa05ab98-eec8-4db5-a84a-9d145cc84e56-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: 
\"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091950 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e29d573c-753f-4d0d-8c55-3e2842703508-srv-cert\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: \"e29d573c-753f-4d0d-8c55-3e2842703508\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.091966 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brk9z\" (UniqueName: \"kubernetes.io/projected/19e5e854-acf9-4b7e-9088-5c8a50ea0186-kube-api-access-brk9z\") pod \"ingress-canary-k476p\" (UID: \"19e5e854-acf9-4b7e-9088-5c8a50ea0186\") " pod="openshift-ingress-canary/ingress-canary-k476p" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.092001 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e29d573c-753f-4d0d-8c55-3e2842703508-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: \"e29d573c-753f-4d0d-8c55-3e2842703508\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.092092 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84392623-69c8-4e0c-a3bf-59a57b487494-signing-key\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.092113 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc054b59-4478-4188-a9a5-12bb26b68c96-service-ca-bundle\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.092136 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04656f2-1bad-494e-a8d9-9671fe42431e-serving-cert\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.092190 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.092215 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqdh7\" (UniqueName: \"kubernetes.io/projected/cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c-kube-api-access-zqdh7\") pod \"control-plane-machine-set-operator-78cbb6b69f-tqscv\" (UID: \"cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.092249 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49ec9e50-30cd-4028-bf0d-ac67afb7e344-serving-cert\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.100832 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.106376 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.146680 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.194306 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.197296 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:47.697269102 +0000 UTC m=+235.478419791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197359 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqdh7\" (UniqueName: \"kubernetes.io/projected/cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c-kube-api-access-zqdh7\") pod \"control-plane-machine-set-operator-78cbb6b69f-tqscv\" (UID: \"cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197388 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197411 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/310d0571-8565-4c51-b88f-791faae59e94-node-bootstrap-token\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197427 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgvlj\" (UniqueName: 
\"kubernetes.io/projected/310d0571-8565-4c51-b88f-791faae59e94-kube-api-access-dgvlj\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197448 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49ec9e50-30cd-4028-bf0d-ac67afb7e344-serving-cert\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197485 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-mountpoint-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197504 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-metrics-tls\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197521 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-srv-cert\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197539 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04656f2-1bad-494e-a8d9-9671fe42431e-config\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197554 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-csi-data-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197576 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-default-certificate\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197606 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-certificates\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197661 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-profile-collector-cert\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 
09:17:47.197678 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhcc\" (UniqueName: \"kubernetes.io/projected/49ec9e50-30cd-4028-bf0d-ac67afb7e344-kube-api-access-5zhcc\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197712 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197728 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plm7\" (UniqueName: \"kubernetes.io/projected/3ba20c1f-29f0-4784-8683-621c75daffb3-kube-api-access-8plm7\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197744 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-metrics-certs\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197758 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxx4j\" (UniqueName: \"kubernetes.io/projected/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-kube-api-access-cxx4j\") pod 
\"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197775 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-socket-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197797 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199126b4-f145-46ee-9018-ab173f1267f7-config\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197813 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197831 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-tls\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197855 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/84392623-69c8-4e0c-a3bf-59a57b487494-signing-cabundle\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197877 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dec76324-c87e-445a-9236-a674976e81dc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sbhg5\" (UID: \"dec76324-c87e-445a-9236-a674976e81dc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197903 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4kc5\" (UniqueName: \"kubernetes.io/projected/31e79cbc-4d50-4925-9b5b-7a5012de13a0-kube-api-access-d4kc5\") pod \"dns-operator-744455d44c-jr5k9\" (UID: \"31e79cbc-4d50-4925-9b5b-7a5012de13a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197924 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx68s\" (UniqueName: \"kubernetes.io/projected/e29d573c-753f-4d0d-8c55-3e2842703508-kube-api-access-tx68s\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: \"e29d573c-753f-4d0d-8c55-3e2842703508\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197941 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rdt\" (UniqueName: \"kubernetes.io/projected/199126b4-f145-46ee-9018-ab173f1267f7-kube-api-access-s5rdt\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197956 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31e79cbc-4d50-4925-9b5b-7a5012de13a0-metrics-tls\") pod \"dns-operator-744455d44c-jr5k9\" (UID: \"31e79cbc-4d50-4925-9b5b-7a5012de13a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197982 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199126b4-f145-46ee-9018-ab173f1267f7-serving-cert\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.197997 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-trusted-ca\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198028 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-stats-auth\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198048 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98xrz\" (UniqueName: \"kubernetes.io/projected/0da359c8-ec5a-4e8b-9632-7b5aba7856d0-kube-api-access-98xrz\") pod \"migrator-59844c95c7-6hzgd\" (UID: \"0da359c8-ec5a-4e8b-9632-7b5aba7856d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" Mar 11 09:17:47 crc kubenswrapper[4830]: 
I0311 09:17:47.198067 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efd0238e-2294-4b23-ab03-88e149c4a0c9-config-volume\") pod \"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198081 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-config\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198097 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz67z\" (UniqueName: \"kubernetes.io/projected/b04656f2-1bad-494e-a8d9-9671fe42431e-kube-api-access-pz67z\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzghj\" (UniqueName: \"kubernetes.io/projected/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-kube-api-access-xzghj\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198134 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tqscv\" (UID: 
\"cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198156 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sj5g\" (UniqueName: \"kubernetes.io/projected/efd0238e-2294-4b23-ab03-88e149c4a0c9-kube-api-access-2sj5g\") pod \"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ba20c1f-29f0-4784-8683-621c75daffb3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198189 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prrjl\" (UniqueName: \"kubernetes.io/projected/dc054b59-4478-4188-a9a5-12bb26b68c96-kube-api-access-prrjl\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198228 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa05ab98-eec8-4db5-a84a-9d145cc84e56-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: \"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-tmpfs\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198263 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9qh2\" (UniqueName: \"kubernetes.io/projected/680233cf-fda8-402e-95a6-a596a0edd470-kube-api-access-l9qh2\") pod \"auto-csr-approver-29553676-chghx\" (UID: \"680233cf-fda8-402e-95a6-a596a0edd470\") " pod="openshift-infra/auto-csr-approver-29553676-chghx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198281 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19e5e854-acf9-4b7e-9088-5c8a50ea0186-cert\") pod \"ingress-canary-k476p\" (UID: \"19e5e854-acf9-4b7e-9088-5c8a50ea0186\") " pod="openshift-ingress-canary/ingress-canary-k476p" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198302 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-plugins-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198323 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-apiservice-cert\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198373 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa05ab98-eec8-4db5-a84a-9d145cc84e56-config\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: \"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198391 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/310d0571-8565-4c51-b88f-791faae59e94-certs\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198411 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svq5s\" (UniqueName: \"kubernetes.io/projected/34b4e37c-7920-4056-b0f9-38606804f021-kube-api-access-svq5s\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198438 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-bound-sa-token\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198456 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwnq2\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-kube-api-access-gwnq2\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198474 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmbm\" (UniqueName: \"kubernetes.io/projected/02fc0c48-db54-4b8c-8642-287a4720251f-kube-api-access-5xmbm\") pod \"package-server-manager-789f6589d5-zdtbz\" (UID: \"02fc0c48-db54-4b8c-8642-287a4720251f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198493 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-config-volume\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198510 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/199126b4-f145-46ee-9018-ab173f1267f7-trusted-ca\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198526 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efd0238e-2294-4b23-ab03-88e149c4a0c9-secret-volume\") pod 
\"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198552 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ba20c1f-29f0-4784-8683-621c75daffb3-proxy-tls\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198568 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-client-ca\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198584 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/02fc0c48-db54-4b8c-8642-287a4720251f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zdtbz\" (UID: \"02fc0c48-db54-4b8c-8642-287a4720251f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-webhook-cert\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198613 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwvdb\" (UniqueName: \"kubernetes.io/projected/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-kube-api-access-kwvdb\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198631 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn5rf\" (UniqueName: \"kubernetes.io/projected/dec76324-c87e-445a-9236-a674976e81dc-kube-api-access-mn5rf\") pod \"multus-admission-controller-857f4d67dd-sbhg5\" (UID: \"dec76324-c87e-445a-9236-a674976e81dc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198649 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa05ab98-eec8-4db5-a84a-9d145cc84e56-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: \"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198666 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvbqr\" (UniqueName: \"kubernetes.io/projected/84392623-69c8-4e0c-a3bf-59a57b487494-kube-api-access-zvbqr\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198683 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e29d573c-753f-4d0d-8c55-3e2842703508-srv-cert\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: 
\"e29d573c-753f-4d0d-8c55-3e2842703508\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198698 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brk9z\" (UniqueName: \"kubernetes.io/projected/19e5e854-acf9-4b7e-9088-5c8a50ea0186-kube-api-access-brk9z\") pod \"ingress-canary-k476p\" (UID: \"19e5e854-acf9-4b7e-9088-5c8a50ea0186\") " pod="openshift-ingress-canary/ingress-canary-k476p" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198713 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-registration-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198749 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e29d573c-753f-4d0d-8c55-3e2842703508-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: \"e29d573c-753f-4d0d-8c55-3e2842703508\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198767 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84392623-69c8-4e0c-a3bf-59a57b487494-signing-key\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.198810 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dc054b59-4478-4188-a9a5-12bb26b68c96-service-ca-bundle\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.199494 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b04656f2-1bad-494e-a8d9-9671fe42431e-config\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.200980 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04656f2-1bad-494e-a8d9-9671fe42431e-serving-cert\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.207790 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-profile-collector-cert\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.208089 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199126b4-f145-46ee-9018-ab173f1267f7-config\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.208333 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-certificates\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.208521 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-default-certificate\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.208696 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-metrics-tls\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.208856 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-apiservice-cert\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.208882 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-srv-cert\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.209292 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/02fc0c48-db54-4b8c-8642-287a4720251f-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zdtbz\" (UID: \"02fc0c48-db54-4b8c-8642-287a4720251f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.209330 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-config\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.209904 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04656f2-1bad-494e-a8d9-9671fe42431e-serving-cert\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.210369 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:47.710356293 +0000 UTC m=+235.491506982 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.210500 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-tmpfs\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.211212 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-config-volume\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.211460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/199126b4-f145-46ee-9018-ab173f1267f7-trusted-ca\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.212422 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-tls\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.213905 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/84392623-69c8-4e0c-a3bf-59a57b487494-signing-cabundle\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.216137 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-webhook-cert\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.216702 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dec76324-c87e-445a-9236-a674976e81dc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sbhg5\" (UID: \"dec76324-c87e-445a-9236-a674976e81dc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.216998 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ba20c1f-29f0-4784-8683-621c75daffb3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.217136 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efd0238e-2294-4b23-ab03-88e149c4a0c9-secret-volume\") pod 
\"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.217809 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa05ab98-eec8-4db5-a84a-9d145cc84e56-config\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: \"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.217870 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efd0238e-2294-4b23-ab03-88e149c4a0c9-config-volume\") pod \"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.218267 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa05ab98-eec8-4db5-a84a-9d145cc84e56-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: \"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.218507 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc054b59-4478-4188-a9a5-12bb26b68c96-service-ca-bundle\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.218514 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-client-ca\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.221834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199126b4-f145-46ee-9018-ab173f1267f7-serving-cert\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.222093 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31e79cbc-4d50-4925-9b5b-7a5012de13a0-metrics-tls\") pod \"dns-operator-744455d44c-jr5k9\" (UID: \"31e79cbc-4d50-4925-9b5b-7a5012de13a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.222420 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/84392623-69c8-4e0c-a3bf-59a57b487494-signing-key\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.222506 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.223569 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-metrics-certs\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.224469 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49ec9e50-30cd-4028-bf0d-ac67afb7e344-serving-cert\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.224971 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ba20c1f-29f0-4784-8683-621c75daffb3-proxy-tls\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.227515 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-tqscv\" (UID: \"cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.229387 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.237166 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-trusted-ca\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.237652 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dc054b59-4478-4188-a9a5-12bb26b68c96-stats-auth\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.239329 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/19e5e854-acf9-4b7e-9088-5c8a50ea0186-cert\") pod \"ingress-canary-k476p\" (UID: \"19e5e854-acf9-4b7e-9088-5c8a50ea0186\") " pod="openshift-ingress-canary/ingress-canary-k476p" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.239616 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.239820 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e29d573c-753f-4d0d-8c55-3e2842703508-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: \"e29d573c-753f-4d0d-8c55-3e2842703508\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.243753 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e29d573c-753f-4d0d-8c55-3e2842703508-srv-cert\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: \"e29d573c-753f-4d0d-8c55-3e2842703508\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.244004 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqdh7\" (UniqueName: \"kubernetes.io/projected/cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c-kube-api-access-zqdh7\") pod \"control-plane-machine-set-operator-78cbb6b69f-tqscv\" (UID: \"cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.246386 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.255162 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.278168 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhcc\" (UniqueName: \"kubernetes.io/projected/49ec9e50-30cd-4028-bf0d-ac67afb7e344-kube-api-access-5zhcc\") pod \"route-controller-manager-6576b87f9c-xm7n9\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.284578 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxx4j\" (UniqueName: \"kubernetes.io/projected/3f48d0f5-0913-47f3-a1ca-10e10359bc3d-kube-api-access-cxx4j\") pod \"packageserver-d55dfcdfc-x98hk\" (UID: \"3f48d0f5-0913-47f3-a1ca-10e10359bc3d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.297930 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prrjl\" (UniqueName: \"kubernetes.io/projected/dc054b59-4478-4188-a9a5-12bb26b68c96-kube-api-access-prrjl\") pod \"router-default-5444994796-j29w2\" (UID: \"dc054b59-4478-4188-a9a5-12bb26b68c96\") " pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.304715 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.304959 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-plugins-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.304983 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/310d0571-8565-4c51-b88f-791faae59e94-certs\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.304999 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svq5s\" (UniqueName: \"kubernetes.io/projected/34b4e37c-7920-4056-b0f9-38606804f021-kube-api-access-svq5s\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.305082 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-registration-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.305103 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/310d0571-8565-4c51-b88f-791faae59e94-node-bootstrap-token\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.305118 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgvlj\" 
(UniqueName: \"kubernetes.io/projected/310d0571-8565-4c51-b88f-791faae59e94-kube-api-access-dgvlj\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.305136 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-mountpoint-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.305153 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-csi-data-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.305207 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-socket-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.305484 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-socket-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.305555 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:47.805540769 +0000 UTC m=+235.586691458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.305581 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-plugins-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.307074 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-mountpoint-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.307320 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-csi-data-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.307407 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/34b4e37c-7920-4056-b0f9-38606804f021-registration-dir\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.311253 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/310d0571-8565-4c51-b88f-791faae59e94-certs\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.311868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/310d0571-8565-4c51-b88f-791faae59e94-node-bootstrap-token\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.321543 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aa05ab98-eec8-4db5-a84a-9d145cc84e56-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lkf72\" (UID: \"aa05ab98-eec8-4db5-a84a-9d145cc84e56\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.330925 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-f87sw"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.331913 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cprgf"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.331989 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9"] Mar 11 09:17:47 crc kubenswrapper[4830]: W0311 09:17:47.333691 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183cb9e1_fa20_48f1_8cb4_7bce808e72ed.slice/crio-1ddbee3b3eecc591ec9bc2439cf5adf50205b250fbfb782956188a8d83621566 WatchSource:0}: Error finding container 1ddbee3b3eecc591ec9bc2439cf5adf50205b250fbfb782956188a8d83621566: Status 404 returned error can't find the container with id 1ddbee3b3eecc591ec9bc2439cf5adf50205b250fbfb782956188a8d83621566 Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.337313 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-r4nnq"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.341580 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.350372 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9qh2\" (UniqueName: \"kubernetes.io/projected/680233cf-fda8-402e-95a6-a596a0edd470-kube-api-access-l9qh2\") pod \"auto-csr-approver-29553676-chghx\" (UID: \"680233cf-fda8-402e-95a6-a596a0edd470\") " pod="openshift-infra/auto-csr-approver-29553676-chghx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.372996 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4kc5\" (UniqueName: \"kubernetes.io/projected/31e79cbc-4d50-4925-9b5b-7a5012de13a0-kube-api-access-d4kc5\") pod \"dns-operator-744455d44c-jr5k9\" (UID: \"31e79cbc-4d50-4925-9b5b-7a5012de13a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.392691 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx68s\" 
(UniqueName: \"kubernetes.io/projected/e29d573c-753f-4d0d-8c55-3e2842703508-kube-api-access-tx68s\") pod \"olm-operator-6b444d44fb-wbrh6\" (UID: \"e29d573c-753f-4d0d-8c55-3e2842703508\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.406179 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.406574 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:47.906563505 +0000 UTC m=+235.687714194 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.415684 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brk9z\" (UniqueName: \"kubernetes.io/projected/19e5e854-acf9-4b7e-9088-5c8a50ea0186-kube-api-access-brk9z\") pod \"ingress-canary-k476p\" (UID: \"19e5e854-acf9-4b7e-9088-5c8a50ea0186\") " pod="openshift-ingress-canary/ingress-canary-k476p" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.435792 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwvdb\" (UniqueName: \"kubernetes.io/projected/80895ee9-5c6f-4e2d-b7c0-88b50d57a720-kube-api-access-kwvdb\") pod \"catalog-operator-68c6474976-6j6fc\" (UID: \"80895ee9-5c6f-4e2d-b7c0-88b50d57a720\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.437753 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.451626 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.459551 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn5rf\" (UniqueName: \"kubernetes.io/projected/dec76324-c87e-445a-9236-a674976e81dc-kube-api-access-mn5rf\") pod \"multus-admission-controller-857f4d67dd-sbhg5\" (UID: \"dec76324-c87e-445a-9236-a674976e81dc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.460101 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-75j46"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.473215 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.473337 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz67z\" (UniqueName: \"kubernetes.io/projected/b04656f2-1bad-494e-a8d9-9671fe42431e-kube-api-access-pz67z\") pod \"service-ca-operator-777779d784-9r6tk\" (UID: \"b04656f2-1bad-494e-a8d9-9671fe42431e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.474134 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-426g8"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.489368 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98xrz\" (UniqueName: 
\"kubernetes.io/projected/0da359c8-ec5a-4e8b-9632-7b5aba7856d0-kube-api-access-98xrz\") pod \"migrator-59844c95c7-6hzgd\" (UID: \"0da359c8-ec5a-4e8b-9632-7b5aba7856d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.504168 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.507070 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.507213 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.007194339 +0000 UTC m=+235.788345028 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.507312 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.507618 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.007604611 +0000 UTC m=+235.788755300 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.515765 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmbm\" (UniqueName: \"kubernetes.io/projected/02fc0c48-db54-4b8c-8642-287a4720251f-kube-api-access-5xmbm\") pod \"package-server-manager-789f6589d5-zdtbz\" (UID: \"02fc0c48-db54-4b8c-8642-287a4720251f\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.526475 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.535560 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-bound-sa-token\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.562131 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.569243 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.569838 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwnq2\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-kube-api-access-gwnq2\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.572133 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xqggt"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.572966 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.574543 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sj5g\" (UniqueName: \"kubernetes.io/projected/efd0238e-2294-4b23-ab03-88e149c4a0c9-kube-api-access-2sj5g\") pod \"collect-profiles-29553675-2mnnx\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.577088 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.580618 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.583304 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" Mar 11 09:17:47 crc kubenswrapper[4830]: W0311 09:17:47.583636 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61f0fae5_d6d9_4f7b_b163_72316a316a37.slice/crio-ea2efb6b0c37366831d486ec2649b1dd734f8dbfd6efc990dda69a2d84e7cee1 WatchSource:0}: Error finding container ea2efb6b0c37366831d486ec2649b1dd734f8dbfd6efc990dda69a2d84e7cee1: Status 404 returned error can't find the container with id ea2efb6b0c37366831d486ec2649b1dd734f8dbfd6efc990dda69a2d84e7cee1 Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.591251 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.594329 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvbqr\" (UniqueName: \"kubernetes.io/projected/84392623-69c8-4e0c-a3bf-59a57b487494-kube-api-access-zvbqr\") pod \"service-ca-9c57cc56f-nfnzk\" (UID: \"84392623-69c8-4e0c-a3bf-59a57b487494\") " pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.598658 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.609179 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.609588 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.109573955 +0000 UTC m=+235.890724644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.620130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzghj\" (UniqueName: \"kubernetes.io/projected/2c8dd96b-cc53-49ce-9f64-ec26e87c62ab-kube-api-access-xzghj\") pod \"dns-default-qn6h2\" (UID: \"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab\") " pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.639605 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553676-chghx" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.645718 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plm7\" (UniqueName: \"kubernetes.io/projected/3ba20c1f-29f0-4784-8683-621c75daffb3-kube-api-access-8plm7\") pod \"machine-config-controller-84d6567774-k8b27\" (UID: \"3ba20c1f-29f0-4784-8683-621c75daffb3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.646546 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-k476p" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.652297 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.656000 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rdt\" (UniqueName: \"kubernetes.io/projected/199126b4-f145-46ee-9018-ab173f1267f7-kube-api-access-s5rdt\") pod \"console-operator-58897d9998-hnjgz\" (UID: \"199126b4-f145-46ee-9018-ab173f1267f7\") " pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.660203 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.665375 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.674222 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgvlj\" (UniqueName: \"kubernetes.io/projected/310d0571-8565-4c51-b88f-791faae59e94-kube-api-access-dgvlj\") pod \"machine-config-server-lhp5r\" (UID: \"310d0571-8565-4c51-b88f-791faae59e94\") " pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.684158 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jf97s"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.691676 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.692702 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svq5s\" (UniqueName: \"kubernetes.io/projected/34b4e37c-7920-4056-b0f9-38606804f021-kube-api-access-svq5s\") pod \"csi-hostpathplugin-rj9w8\" (UID: \"34b4e37c-7920-4056-b0f9-38606804f021\") " pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.696164 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k86x2"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.726442 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.727305 
4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.227280007 +0000 UTC m=+236.008430686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.740786 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.755440 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.771004 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.812210 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.813700 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.815893 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pplbq"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.815936 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.830944 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.831179 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.331157615 +0000 UTC m=+236.112308304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.831970 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.832704 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.33268849 +0000 UTC m=+236.113839179 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.895877 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv"] Mar 11 09:17:47 crc kubenswrapper[4830]: W0311 09:17:47.903811 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1c56ec_23e5_4b04_87c3_fbfb6775c07d.slice/crio-fbe15e48674b57c0d8d12bd64fadef6be14ab5738580c7dbbc113268a85a7203 WatchSource:0}: Error finding container fbe15e48674b57c0d8d12bd64fadef6be14ab5738580c7dbbc113268a85a7203: Status 404 returned error can't find the container with id fbe15e48674b57c0d8d12bd64fadef6be14ab5738580c7dbbc113268a85a7203 Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.916620 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-s8cnh"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.918272 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.924409 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6dtx"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.934389 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:47 crc kubenswrapper[4830]: E0311 09:17:47.934837 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.434813267 +0000 UTC m=+236.215964006 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.948703 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.964496 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72"] Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.971561 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lhp5r" Mar 11 09:17:47 crc kubenswrapper[4830]: I0311 09:17:47.977875 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.004278 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jr5k9"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.035816 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.036126 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.5361153 +0000 UTC m=+236.317265989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.067939 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" event={"ID":"87e5c483-a4bb-46b6-add5-1d1638cbbbab","Type":"ContainerStarted","Data":"e189ff2936a94828bac7268c138902e2d80cf8b13416e3802eef25d437619e11"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.068840 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" event={"ID":"81a2f08a-36b5-487c-8222-f303e055755f","Type":"ContainerStarted","Data":"be185b292c3d387b856223d3621774feabb8c2da4ecc588c294f85c028b706bb"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.070187 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" event={"ID":"88631f5e-5bc9-450f-a46f-b81b6514afdd","Type":"ContainerStarted","Data":"56340668c5ddf479fbecf0316a974070aa0479d429d5cdebcfb4c4e7eb3bee32"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.071138 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" event={"ID":"d0a7f311-e393-45f7-9b05-28590202d310","Type":"ContainerStarted","Data":"bb2f5316ff5a5c908d877bf78116f676091c545510bd35307396ed135e375149"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.072080 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" event={"ID":"ea88a306-e701-4a49-b4d2-7c4b62372c06","Type":"ContainerStarted","Data":"c8de23314e89a229d5ca30f7c047ea2763189c7634e3c5d9ceccb8229ad6d0db"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.086965 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" event={"ID":"49ec9e50-30cd-4028-bf0d-ac67afb7e344","Type":"ContainerStarted","Data":"d7f71751d8b7f3b6a96be88417311b8d2813f6ba170427c81386e29fb762e615"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.132031 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" event={"ID":"c61d535e-afb5-4006-a758-8bba8735a860","Type":"ContainerStarted","Data":"7ae89d9a229f8f2e3380ede3842d1b0e91bfbefa876a9344f3df94d5ca4e794b"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.137514 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.137943 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.637924839 +0000 UTC m=+236.419075538 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.138064 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd48e28d_bcdd_4bba_a540_0213cda9599a.slice/crio-d508f3b5c28fbc1183785714cfe711d60ab667fe5100e5ce16253230921ff692 WatchSource:0}: Error finding container d508f3b5c28fbc1183785714cfe711d60ab667fe5100e5ce16253230921ff692: Status 404 returned error can't find the container with id d508f3b5c28fbc1183785714cfe711d60ab667fe5100e5ce16253230921ff692 Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.152868 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" event={"ID":"8377f033-b2c4-426e-bf41-3de032a37373","Type":"ContainerStarted","Data":"68ce7a2a7df6481d260f14c42a1c7d7f54b69c65c3c9811b2519d2b137b615af"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.154965 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" event={"ID":"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de","Type":"ContainerStarted","Data":"1f992e3ead4be5a9ca2489b4232739aa05c77640f043b2b2557c8d511a2942cc"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.157517 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" event={"ID":"61f0fae5-d6d9-4f7b-b163-72316a316a37","Type":"ContainerStarted","Data":"ea2efb6b0c37366831d486ec2649b1dd734f8dbfd6efc990dda69a2d84e7cee1"} Mar 11 
09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.160659 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" event={"ID":"183cb9e1-fa20-48f1-8cb4-7bce808e72ed","Type":"ContainerStarted","Data":"588b8bb5a3736eced848b1fabf7a03cfd1987c9c2bf60044d38cda8d6d534dfe"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.160697 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" event={"ID":"183cb9e1-fa20-48f1-8cb4-7bce808e72ed","Type":"ContainerStarted","Data":"1ddbee3b3eecc591ec9bc2439cf5adf50205b250fbfb782956188a8d83621566"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.164169 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cprgf" event={"ID":"3569f4d9-01f3-47be-bc27-af6a7cb999a1","Type":"ContainerStarted","Data":"2f1093cb9715a8e4a19ca6e39e9b0cc6c618c291b6abd9a94f3d7de92eb4fe73"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.164227 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cprgf" event={"ID":"3569f4d9-01f3-47be-bc27-af6a7cb999a1","Type":"ContainerStarted","Data":"5f14eff15dde842edf3b21c56bd22ffc418194fcb878a73bca970a26f325f863"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.164886 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-cprgf" Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.168084 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-cprgf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.168131 4830 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-cprgf" podUID="3569f4d9-01f3-47be-bc27-af6a7cb999a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.171511 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" event={"ID":"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d","Type":"ContainerStarted","Data":"fbe15e48674b57c0d8d12bd64fadef6be14ab5738580c7dbbc113268a85a7203"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.189407 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" event={"ID":"0584184a-1f77-477d-8988-df9f60c5b194","Type":"ContainerStarted","Data":"ecd49c9ca4231af63bf9dbc282dbe988b4dd57b79403a8e1bc24de9f371defb4"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.204047 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" event={"ID":"83f0f2b7-77f8-4ce3-bc3b-d24879087a86","Type":"ContainerStarted","Data":"e91a8aebc614026fe085b6428fff0c85fee2221be07a9fcfacfd420b25dcd70a"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.204092 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" event={"ID":"83f0f2b7-77f8-4ce3-bc3b-d24879087a86","Type":"ContainerStarted","Data":"75e6421cad4c2713f659e8ccea7ea641b9b6c48e88b449962f35a13c3731f89b"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.207281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" 
event={"ID":"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf","Type":"ContainerStarted","Data":"a9e4d2d4327c127874a5dbb8c35358f97dfa48d9c7484326bba0d00830331984"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.207301 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" event={"ID":"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf","Type":"ContainerStarted","Data":"c05a7cc32973cea43077fded9117cb0029057c8f0a37cdeb8c1470b58061f615"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.218406 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-75j46" event={"ID":"cd450036-5201-4553-a9de-c08a7a9c9f52","Type":"ContainerStarted","Data":"39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.218442 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-75j46" event={"ID":"cd450036-5201-4553-a9de-c08a7a9c9f52","Type":"ContainerStarted","Data":"4155b31ee39b8de18e3327f73d3c10bca7896fd617502a3b86b7df2b5899ff62"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.222606 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" event={"ID":"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3","Type":"ContainerStarted","Data":"14e4c4c75d7708f334c208763366571b0c34104b9c2f65fa74cc885ec4de1a78"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.225180 4830 generic.go:334] "Generic (PLEG): container finished" podID="0a49527e-c963-4f2c-8e8c-5f2a879ac281" containerID="72f606299855bb8c86947372a3ee726a2a87cef15a7f333fc2b34dd439f1faa6" exitCode=0 Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.225218 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" 
event={"ID":"0a49527e-c963-4f2c-8e8c-5f2a879ac281","Type":"ContainerDied","Data":"72f606299855bb8c86947372a3ee726a2a87cef15a7f333fc2b34dd439f1faa6"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.225244 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" event={"ID":"0a49527e-c963-4f2c-8e8c-5f2a879ac281","Type":"ContainerStarted","Data":"749160ece7bf6fafd77bbcb11ab10987dbb13409ac9f020d4c331128dfc40b88"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.226252 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-j29w2" event={"ID":"dc054b59-4478-4188-a9a5-12bb26b68c96","Type":"ContainerStarted","Data":"503ed151aecae97309800962c124d7609ebd3ea267e07cfb9c2424095535d409"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.227118 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" event={"ID":"cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c","Type":"ContainerStarted","Data":"f51a7ce8f2abe72f0bb2e7fad3a5397524f14cbb5d572e8b55fe7c1fa002f4a6"} Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.239912 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.240168 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.74015728 +0000 UTC m=+236.521307969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.284768 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.314416 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.344738 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.347080 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.847052187 +0000 UTC m=+236.628202876 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.405421 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553676-chghx"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.417604 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.448972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.449316 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:48.949305448 +0000 UTC m=+236.730456137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.460347 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod310d0571_8565_4c51_b88f_791faae59e94.slice/crio-2437bb0000f55728278012e4b84ec2ee1ac9d2b92d0eab169ee5c700276a58ce WatchSource:0}: Error finding container 2437bb0000f55728278012e4b84ec2ee1ac9d2b92d0eab169ee5c700276a58ce: Status 404 returned error can't find the container with id 2437bb0000f55728278012e4b84ec2ee1ac9d2b92d0eab169ee5c700276a58ce Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.498564 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod680233cf_fda8_402e_95a6_a596a0edd470.slice/crio-eb33a8b76e0b36f1440e218b0e3689d921e460afff0ca753982e13615ac329c3 WatchSource:0}: Error finding container eb33a8b76e0b36f1440e218b0e3689d921e460afff0ca753982e13615ac329c3: Status 404 returned error can't find the container with id eb33a8b76e0b36f1440e218b0e3689d921e460afff0ca753982e13615ac329c3 Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.514681 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd0238e_2294_4b23_ab03_88e149c4a0c9.slice/crio-170f3051fe18aa334f41a168cc283c9b22fc9e3225fdc1200b30194c1bebc7f9 WatchSource:0}: Error finding container 170f3051fe18aa334f41a168cc283c9b22fc9e3225fdc1200b30194c1bebc7f9: Status 404 returned error can't find the 
container with id 170f3051fe18aa334f41a168cc283c9b22fc9e3225fdc1200b30194c1bebc7f9 Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.523452 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.542091 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.551095 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.551483 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.051467986 +0000 UTC m=+236.832618675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.654874 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.655792 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.155780698 +0000 UTC m=+236.936931377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.670996 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qn6h2"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.671931 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sbhg5"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.674996 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hnjgz"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.699189 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-k476p"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.756319 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rj9w8"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.757455 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.757558 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-11 09:17:49.257542345 +0000 UTC m=+237.038693034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.757929 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.765078 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.265060505 +0000 UTC m=+237.046211184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.767913 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec76324_c87e_445a_9236_a674976e81dc.slice/crio-f8e0bc9217d11e122f35cbfc9836a3924c3e08e259a2c2430ac04c1079ba0cbd WatchSource:0}: Error finding container f8e0bc9217d11e122f35cbfc9836a3924c3e08e259a2c2430ac04c1079ba0cbd: Status 404 returned error can't find the container with id f8e0bc9217d11e122f35cbfc9836a3924c3e08e259a2c2430ac04c1079ba0cbd Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.805479 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nfnzk"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.809523 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.811661 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.832826 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27"] Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.858565 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.858958 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.358903671 +0000 UTC m=+237.140054350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.885617 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e5e854_acf9_4b7e_9088_5c8a50ea0186.slice/crio-27d2669b043176ad7e7cd5213354ea4321483dc1e7df279ed459001a8b4e1a9e WatchSource:0}: Error finding container 27d2669b043176ad7e7cd5213354ea4321483dc1e7df279ed459001a8b4e1a9e: Status 404 returned error can't find the container with id 27d2669b043176ad7e7cd5213354ea4321483dc1e7df279ed459001a8b4e1a9e Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.899045 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84392623_69c8_4e0c_a3bf_59a57b487494.slice/crio-f39063030edf33b62d2a1e47c047c3d1690efd00d341d681337a8d67f14adf43 WatchSource:0}: Error finding container f39063030edf33b62d2a1e47c047c3d1690efd00d341d681337a8d67f14adf43: Status 
404 returned error can't find the container with id f39063030edf33b62d2a1e47c047c3d1690efd00d341d681337a8d67f14adf43 Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.900084 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b4e37c_7920_4056_b0f9_38606804f021.slice/crio-5092e5c4274718343e6987949ea9fb37f65cd292c5032012b8956629a018d660 WatchSource:0}: Error finding container 5092e5c4274718343e6987949ea9fb37f65cd292c5032012b8956629a018d660: Status 404 returned error can't find the container with id 5092e5c4274718343e6987949ea9fb37f65cd292c5032012b8956629a018d660 Mar 11 09:17:48 crc kubenswrapper[4830]: W0311 09:17:48.913290 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04656f2_1bad_494e_a8d9_9671fe42431e.slice/crio-1c488256a6b323075686eba3509dd439271ab2aefb3422ad2e22db2ad49f213a WatchSource:0}: Error finding container 1c488256a6b323075686eba3509dd439271ab2aefb3422ad2e22db2ad49f213a: Status 404 returned error can't find the container with id 1c488256a6b323075686eba3509dd439271ab2aefb3422ad2e22db2ad49f213a Mar 11 09:17:48 crc kubenswrapper[4830]: I0311 09:17:48.961322 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:48 crc kubenswrapper[4830]: E0311 09:17:48.962321 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-11 09:17:49.462308506 +0000 UTC m=+237.243459195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.071829 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.072721 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.572679994 +0000 UTC m=+237.353830713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.176785 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.178500 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.678488958 +0000 UTC m=+237.459639647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.230880 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-cprgf" podStartSLOduration=167.230861795 podStartE2EDuration="2m47.230861795s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.229595399 +0000 UTC m=+237.010746108" watchObservedRunningTime="2026-03-11 09:17:49.230861795 +0000 UTC m=+237.012012474" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.255731 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" event={"ID":"0a49527e-c963-4f2c-8e8c-5f2a879ac281","Type":"ContainerStarted","Data":"9c8f62562664a7e69d8aeb96bfa242fbfe0b5bf56f5f8f08d7e8fcd214d60c95"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.270208 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-f87sw" podStartSLOduration=167.270190863 podStartE2EDuration="2m47.270190863s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.269348928 +0000 UTC m=+237.050499617" watchObservedRunningTime="2026-03-11 09:17:49.270190863 +0000 UTC m=+237.051341562" Mar 11 09:17:49 crc 
kubenswrapper[4830]: I0311 09:17:49.280298 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.280790 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.78076905 +0000 UTC m=+237.561919739 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.375995 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" event={"ID":"80895ee9-5c6f-4e2d-b7c0-88b50d57a720","Type":"ContainerStarted","Data":"b269b09e5c9638ad54053c0d1e85dbd4151f1a5d5aa6dbd1ec0dfea0f8e56c9c"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.382005 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wzwg2" podStartSLOduration=167.381990171 podStartE2EDuration="2m47.381990171s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.333314203 +0000 UTC m=+237.114464892" watchObservedRunningTime="2026-03-11 09:17:49.381990171 +0000 UTC m=+237.163140850" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.382922 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.383214 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.883203707 +0000 UTC m=+237.664354396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.383575 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-75j46" podStartSLOduration=167.383568408 podStartE2EDuration="2m47.383568408s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.381537669 +0000 UTC m=+237.162688348" watchObservedRunningTime="2026-03-11 09:17:49.383568408 +0000 UTC m=+237.164719097" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.397615 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" event={"ID":"c7509bc0-f2e5-4c08-9c41-9c980b8bf8e3","Type":"ContainerStarted","Data":"5239baeb805f9f029bbf3fd6692e84abe7a6e14b63e1bb8784beefc7d74ddbe6"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.417272 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" event={"ID":"aa05ab98-eec8-4db5-a84a-9d145cc84e56","Type":"ContainerStarted","Data":"d35f19c8c2a11ead49a83a99dc79e46f66becc488c19cb09dfc43f2735e81e15"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.417323 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" 
event={"ID":"aa05ab98-eec8-4db5-a84a-9d145cc84e56","Type":"ContainerStarted","Data":"a1a03821d0d29c135afb52f5d736330bc57de3d44014171e803bc9c584f25beb"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.427752 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5tmf8" podStartSLOduration=167.427738526 podStartE2EDuration="2m47.427738526s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.426894582 +0000 UTC m=+237.208045281" watchObservedRunningTime="2026-03-11 09:17:49.427738526 +0000 UTC m=+237.208889215" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.442360 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" event={"ID":"dec76324-c87e-445a-9236-a674976e81dc","Type":"ContainerStarted","Data":"f8e0bc9217d11e122f35cbfc9836a3924c3e08e259a2c2430ac04c1079ba0cbd"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.448638 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hnjgz" event={"ID":"199126b4-f145-46ee-9018-ab173f1267f7","Type":"ContainerStarted","Data":"65de659ca68f47fc5e124e9bf011ff4f1ad18baa087fd17f5eb4d66ce1867841"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.461236 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lkf72" podStartSLOduration=166.461219372 podStartE2EDuration="2m46.461219372s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.448556403 +0000 UTC m=+237.229707092" 
watchObservedRunningTime="2026-03-11 09:17:49.461219372 +0000 UTC m=+237.242370061" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.487800 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.488961 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:49.9889446 +0000 UTC m=+237.770095279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.491426 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" event={"ID":"4704aedc-31a3-4890-a1f9-1fc6533caae0","Type":"ContainerStarted","Data":"9ea8aa99f994844c1c768de2046792ef15a4c98faff645153035d941ba55d3a4"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.491567 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" 
event={"ID":"4704aedc-31a3-4890-a1f9-1fc6533caae0","Type":"ContainerStarted","Data":"5f25e363aa3fa889a59bcb18c348d1dbc711117dc8b049f256f70de28f442102"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.556260 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" event={"ID":"88631f5e-5bc9-450f-a46f-b81b6514afdd","Type":"ContainerStarted","Data":"6ae0cd731bd52212856a6d6115b98f6026f7c8c62284bd173f537958e8516230"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.571521 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" event={"ID":"3f48d0f5-0913-47f3-a1ca-10e10359bc3d","Type":"ContainerStarted","Data":"f8b144ce5e6df927d3c427b03e37f90546e129104e7c9c45c158f4bdf9dac5cd"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.571830 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" event={"ID":"3f48d0f5-0913-47f3-a1ca-10e10359bc3d","Type":"ContainerStarted","Data":"4d6d444a1b5072aa14f0f824dd1576d720dfaacc341d9654633b4d9b69af7897"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.572199 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.581383 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6zpx" podStartSLOduration=166.581367575 podStartE2EDuration="2m46.581367575s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.581096008 +0000 UTC m=+237.362246707" 
watchObservedRunningTime="2026-03-11 09:17:49.581367575 +0000 UTC m=+237.362518254" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.585336 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" event={"ID":"84392623-69c8-4e0c-a3bf-59a57b487494","Type":"ContainerStarted","Data":"f39063030edf33b62d2a1e47c047c3d1690efd00d341d681337a8d67f14adf43"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.594954 4830 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x98hk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" start-of-body= Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.594992 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" podUID="3f48d0f5-0913-47f3-a1ca-10e10359bc3d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.19:5443/healthz\": dial tcp 10.217.0.19:5443: connect: connection refused" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.595789 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.596107 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.096096754 +0000 UTC m=+237.877247443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.642374 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" podStartSLOduration=166.642343693 podStartE2EDuration="2m46.642343693s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.641108817 +0000 UTC m=+237.422259526" watchObservedRunningTime="2026-03-11 09:17:49.642343693 +0000 UTC m=+237.423494382" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.662823 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" event={"ID":"eacc4dea-3b3a-47c2-9c1e-f8392a7509bf","Type":"ContainerStarted","Data":"6b6e0040ae63876b829c34b7dfba2e11a9b53a088e97831a3e2aa18d34b14179"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.694604 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" podStartSLOduration=166.694586487 podStartE2EDuration="2m46.694586487s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.69162469 +0000 UTC m=+237.472775379" watchObservedRunningTime="2026-03-11 09:17:49.694586487 +0000 UTC m=+237.475737176" Mar 11 09:17:49 
crc kubenswrapper[4830]: I0311 09:17:49.700296 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.701296 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.201283901 +0000 UTC m=+237.982434590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.708139 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" event={"ID":"0da359c8-ec5a-4e8b-9632-7b5aba7856d0","Type":"ContainerStarted","Data":"7f130685ecafac8fe2b88fe472585b28adf1eb30b86ceceae62c32c02f8f4800"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.708188 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" event={"ID":"0da359c8-ec5a-4e8b-9632-7b5aba7856d0","Type":"ContainerStarted","Data":"7fa674a161a81be5747b432f7324b7a92e251372d3240e3ae6bfbb6c6354100c"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.717751 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" event={"ID":"34b4e37c-7920-4056-b0f9-38606804f021","Type":"ContainerStarted","Data":"5092e5c4274718343e6987949ea9fb37f65cd292c5032012b8956629a018d660"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.720300 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" event={"ID":"31e79cbc-4d50-4925-9b5b-7a5012de13a0","Type":"ContainerStarted","Data":"e7436a4b98421745c69ba4c47f61048c10a0efa7a489c4b1380fa7d6115de576"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.732531 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" event={"ID":"cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c","Type":"ContainerStarted","Data":"79d0039b3950b7ac1396dd11593e70d4d627368241766649566df7ed99d0eaf0"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.741545 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553676-chghx" event={"ID":"680233cf-fda8-402e-95a6-a596a0edd470","Type":"ContainerStarted","Data":"eb33a8b76e0b36f1440e218b0e3689d921e460afff0ca753982e13615ac329c3"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.783778 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wdpdj" podStartSLOduration=167.783759956 podStartE2EDuration="2m47.783759956s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.758670864 +0000 UTC m=+237.539821553" watchObservedRunningTime="2026-03-11 09:17:49.783759956 +0000 UTC m=+237.564910645" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.784303 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-tqscv" podStartSLOduration=166.784297962 podStartE2EDuration="2m46.784297962s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.780699597 +0000 UTC m=+237.561850286" watchObservedRunningTime="2026-03-11 09:17:49.784297962 +0000 UTC m=+237.565448651" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.802848 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.803116 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.30310539 +0000 UTC m=+238.084256069 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.809970 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" event={"ID":"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b","Type":"ContainerStarted","Data":"a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.810030 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" event={"ID":"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b","Type":"ContainerStarted","Data":"c7c67ffc0c8ddee7af0e860db00032ead86c1a9d55218cc921b80a1bc504ccf7"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.811332 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.817733 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-j29w2" event={"ID":"dc054b59-4478-4188-a9a5-12bb26b68c96","Type":"ContainerStarted","Data":"f9a51abc179d7564d1d5f1c81a5236d2f415a126a924cec2ae057ffcf676db48"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.843510 4830 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-l6dtx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection 
refused" start-of-body= Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.843551 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" podUID="23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.867188 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" podStartSLOduration=167.867171468 podStartE2EDuration="2m47.867171468s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.866798258 +0000 UTC m=+237.647948957" watchObservedRunningTime="2026-03-11 09:17:49.867171468 +0000 UTC m=+237.648322167" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.901795 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" event={"ID":"87e5c483-a4bb-46b6-add5-1d1638cbbbab","Type":"ContainerStarted","Data":"184d976ae95bcf7314a4ec2a15612dbbbc8c96b28d0d83772d3006397aa58df3"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.901848 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" event={"ID":"87e5c483-a4bb-46b6-add5-1d1638cbbbab","Type":"ContainerStarted","Data":"03845ec8ccacf620da9aaebc6541fe8cfeccf2f232fe596aa25f8cac2a86dc52"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.904849 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:49 crc kubenswrapper[4830]: E0311 09:17:49.906574 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.406558547 +0000 UTC m=+238.187709236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.920700 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-j29w2" podStartSLOduration=166.920674498 podStartE2EDuration="2m46.920674498s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.901791618 +0000 UTC m=+237.682942317" watchObservedRunningTime="2026-03-11 09:17:49.920674498 +0000 UTC m=+237.701825187" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.937579 4830 generic.go:334] "Generic (PLEG): container finished" podID="61f0fae5-d6d9-4f7b-b163-72316a316a37" containerID="6eb7da745106dad191c039db3e94c78ecf5085b58d34b43306487fcc3357c332" exitCode=0 Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.938524 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" event={"ID":"61f0fae5-d6d9-4f7b-b163-72316a316a37","Type":"ContainerDied","Data":"6eb7da745106dad191c039db3e94c78ecf5085b58d34b43306487fcc3357c332"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.970149 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-hhjh9" podStartSLOduration=167.970121129 podStartE2EDuration="2m47.970121129s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:49.938534839 +0000 UTC m=+237.719685538" watchObservedRunningTime="2026-03-11 09:17:49.970121129 +0000 UTC m=+237.751271818" Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.987096 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" event={"ID":"ea88a306-e701-4a49-b4d2-7c4b62372c06","Type":"ContainerStarted","Data":"4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b"} Mar 11 09:17:49 crc kubenswrapper[4830]: I0311 09:17:49.988403 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.010244 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.011407 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.511395673 +0000 UTC m=+238.292546362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.061205 4830 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-k86x2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" start-of-body= Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.061267 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" podUID="ea88a306-e701-4a49-b4d2-7c4b62372c06" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.064911 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" event={"ID":"fd48e28d-bcdd-4bba-a540-0213cda9599a","Type":"ContainerStarted","Data":"0487dc58ae92bb233ad3abcc092c951d4881db96aa0ea1fc4f126478e67d8518"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.064947 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" 
event={"ID":"fd48e28d-bcdd-4bba-a540-0213cda9599a","Type":"ContainerStarted","Data":"d508f3b5c28fbc1183785714cfe711d60ab667fe5100e5ce16253230921ff692"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.083567 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qn6h2" event={"ID":"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab","Type":"ContainerStarted","Data":"648c37bcf0a1f181b273fbde29b3c1f1ae21d6c8208610db5a421d3a670c34cc"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.085892 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" event={"ID":"d0a7f311-e393-45f7-9b05-28590202d310","Type":"ContainerStarted","Data":"f8305915aa9f062755913e0bad44579499cb302b106d5038faeb000ddd59c82b"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.091980 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" podStartSLOduration=168.091958593 podStartE2EDuration="2m48.091958593s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.031431927 +0000 UTC m=+237.812582656" watchObservedRunningTime="2026-03-11 09:17:50.091958593 +0000 UTC m=+237.873109282" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.112058 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.113536 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.613519711 +0000 UTC m=+238.394670400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.126959 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" podStartSLOduration=167.126941032 podStartE2EDuration="2m47.126941032s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.095190547 +0000 UTC m=+237.876341236" watchObservedRunningTime="2026-03-11 09:17:50.126941032 +0000 UTC m=+237.908091721" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.127213 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-h7j2f" podStartSLOduration=167.127209151 podStartE2EDuration="2m47.127209151s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.126347665 +0000 UTC m=+237.907498364" watchObservedRunningTime="2026-03-11 09:17:50.127209151 +0000 UTC m=+237.908359840" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.161207 
4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" event={"ID":"8377f033-b2c4-426e-bf41-3de032a37373","Type":"ContainerStarted","Data":"8340828c53e8526bd3e4bee7ebfbe9db3d76a7982a6f2a1ddcf2cac450ca22a2"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.177975 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lhp5r" event={"ID":"310d0571-8565-4c51-b88f-791faae59e94","Type":"ContainerStarted","Data":"26927c23dcac37c06f16be0e6e476720887fccafd034162b4268d19d28d8ee2e"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.178029 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lhp5r" event={"ID":"310d0571-8565-4c51-b88f-791faae59e94","Type":"ContainerStarted","Data":"2437bb0000f55728278012e4b84ec2ee1ac9d2b92d0eab169ee5c700276a58ce"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.186266 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" event={"ID":"02fc0c48-db54-4b8c-8642-287a4720251f","Type":"ContainerStarted","Data":"73e290b18f15a0f5f61763c55a07d9c0e64eed592c49f6bc0b7a56fa379f12bc"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.195910 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-jf97s" podStartSLOduration=168.195890393 podStartE2EDuration="2m48.195890393s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.193211285 +0000 UTC m=+237.974361974" watchObservedRunningTime="2026-03-11 09:17:50.195890393 +0000 UTC m=+237.977041082" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.200631 4830 ???:1] "http: TLS handshake error from 
192.168.126.11:57632: no serving certificate available for the kubelet" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.202586 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k476p" event={"ID":"19e5e854-acf9-4b7e-9088-5c8a50ea0186","Type":"ContainerStarted","Data":"27d2669b043176ad7e7cd5213354ea4321483dc1e7df279ed459001a8b4e1a9e"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.214040 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.215256 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.715242657 +0000 UTC m=+238.496393346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.215906 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-lhp5r" podStartSLOduration=6.215880746 podStartE2EDuration="6.215880746s" podCreationTimestamp="2026-03-11 09:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.213669261 +0000 UTC m=+237.994819950" watchObservedRunningTime="2026-03-11 09:17:50.215880746 +0000 UTC m=+237.997031435" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.255367 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-k476p" podStartSLOduration=6.255346876 podStartE2EDuration="6.255346876s" podCreationTimestamp="2026-03-11 09:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.240316678 +0000 UTC m=+238.021467367" watchObservedRunningTime="2026-03-11 09:17:50.255346876 +0000 UTC m=+238.036497565" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.274388 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" event={"ID":"49ec9e50-30cd-4028-bf0d-ac67afb7e344","Type":"ContainerStarted","Data":"71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953"} Mar 11 09:17:50 crc 
kubenswrapper[4830]: I0311 09:17:50.275220 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.307320 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" event={"ID":"c61d535e-afb5-4006-a758-8bba8735a860","Type":"ContainerStarted","Data":"e908d7bcbb7cc0c0a321bddd4f7137df0885dfd2db6cf98d4f3fc7e53b9125da"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.308100 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.319882 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.321103 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.821083143 +0000 UTC m=+238.602233832 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.326159 4830 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pplbq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.326203 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.333067 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" event={"ID":"bc1c56ec-23e5-4b04-87c3-fbfb6775c07d","Type":"ContainerStarted","Data":"c49a4a2cb8604e8b2c5bd4bd208ac4b77ba1398122e8f37d32252b4131177c26"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.341966 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" event={"ID":"0584184a-1f77-477d-8988-df9f60c5b194","Type":"ContainerStarted","Data":"dcc0d31700c6a2fce1cb3b5a001581318fe0e1c12ced93f67f13bf5aa3707883"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.357798 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" podStartSLOduration=167.357779072 podStartE2EDuration="2m47.357779072s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.321960239 +0000 UTC m=+238.103110928" watchObservedRunningTime="2026-03-11 09:17:50.357779072 +0000 UTC m=+238.138929761" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.359796 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" podStartSLOduration=167.359788002 podStartE2EDuration="2m47.359788002s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.359611967 +0000 UTC m=+238.140762676" watchObservedRunningTime="2026-03-11 09:17:50.359788002 +0000 UTC m=+238.140938691" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.395631 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cznhz" podStartSLOduration=168.395613946 podStartE2EDuration="2m48.395613946s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.395487982 +0000 UTC m=+238.176638661" watchObservedRunningTime="2026-03-11 09:17:50.395613946 +0000 UTC m=+238.176764635" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.401958 4830 generic.go:334] "Generic (PLEG): container finished" podID="bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de" containerID="1de63111c08e786e889a0e3c9209b488668a34440861e23033bbaaf9fe8e8f25" exitCode=0 
Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.402237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" event={"ID":"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de","Type":"ContainerDied","Data":"1de63111c08e786e889a0e3c9209b488668a34440861e23033bbaaf9fe8e8f25"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.414830 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-knlpj" podStartSLOduration=167.414815826 podStartE2EDuration="2m47.414815826s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.41462808 +0000 UTC m=+238.195778769" watchObservedRunningTime="2026-03-11 09:17:50.414815826 +0000 UTC m=+238.195966515" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.427395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.427806 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:50.927793104 +0000 UTC m=+238.708943793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.439795 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" event={"ID":"3ba20c1f-29f0-4784-8683-621c75daffb3","Type":"ContainerStarted","Data":"b960bb7ec86cdca2ea9b2558476e74ef9b24721652f1e25184f20575a65c5e40"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.455351 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" event={"ID":"efd0238e-2294-4b23-ab03-88e149c4a0c9","Type":"ContainerStarted","Data":"ef91b38f05c9233b18dc904599d163b9c05cf7a5f21254180b50c4a2ffe400b5"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.455385 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" event={"ID":"efd0238e-2294-4b23-ab03-88e149c4a0c9","Type":"ContainerStarted","Data":"170f3051fe18aa334f41a168cc283c9b22fc9e3225fdc1200b30194c1bebc7f9"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.461473 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57642: no serving certificate available for the kubelet" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.475330 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.491232 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" event={"ID":"b04656f2-1bad-494e-a8d9-9671fe42431e","Type":"ContainerStarted","Data":"1c488256a6b323075686eba3509dd439271ab2aefb3422ad2e22db2ad49f213a"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.508977 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" event={"ID":"e29d573c-753f-4d0d-8c55-3e2842703508","Type":"ContainerStarted","Data":"e7d90466e1e581cf4600805b93dbc677a9d57b79af3b9c38841087d0b5bb7f0e"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.509067 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.528096 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.529247 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.029231982 +0000 UTC m=+238.810382671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.532743 4830 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-wbrh6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.532792 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" podUID="e29d573c-753f-4d0d-8c55-3e2842703508" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.537079 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" podStartSLOduration=168.53706461 podStartE2EDuration="2m48.53706461s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.536765571 +0000 UTC m=+238.317916260" watchObservedRunningTime="2026-03-11 09:17:50.53706461 +0000 UTC m=+238.318215299" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.557284 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" 
event={"ID":"81a2f08a-36b5-487c-8222-f303e055755f","Type":"ContainerStarted","Data":"804f6c64207c76a6eb5c309a2af0cefe9aecf92e0c47af7b2a33ba36b7afbd89"} Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.559003 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-cprgf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.559070 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cprgf" podUID="3569f4d9-01f3-47be-bc27-af6a7cb999a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.565824 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.578010 4830 patch_prober.go:28] interesting pod/router-default-5444994796-j29w2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 09:17:50 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 11 09:17:50 crc kubenswrapper[4830]: [+]process-running ok Mar 11 09:17:50 crc kubenswrapper[4830]: healthz check failed Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.578120 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j29w2" podUID="dc054b59-4478-4188-a9a5-12bb26b68c96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.591291 4830 ???:1] "http: TLS handshake error from 
192.168.126.11:57650: no serving certificate available for the kubelet" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.620122 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" podStartSLOduration=167.620103342 podStartE2EDuration="2m47.620103342s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.619339119 +0000 UTC m=+238.400489818" watchObservedRunningTime="2026-03-11 09:17:50.620103342 +0000 UTC m=+238.401254031" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.633172 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.637055 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.137036285 +0000 UTC m=+238.918186974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.672702 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" podStartSLOduration=167.672676354 podStartE2EDuration="2m47.672676354s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.664365161 +0000 UTC m=+238.445515870" watchObservedRunningTime="2026-03-11 09:17:50.672676354 +0000 UTC m=+238.453827043" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.731466 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" podStartSLOduration=167.731448968 podStartE2EDuration="2m47.731448968s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.721423045 +0000 UTC m=+238.502573744" watchObservedRunningTime="2026-03-11 09:17:50.731448968 +0000 UTC m=+238.512599657" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.735239 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.735765 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.235748013 +0000 UTC m=+239.016898712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.792115 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57654: no serving certificate available for the kubelet" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.841614 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.841903 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.341892118 +0000 UTC m=+239.123042807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.900600 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57658: no serving certificate available for the kubelet" Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.945241 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:50 crc kubenswrapper[4830]: E0311 09:17:50.945644 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.445629072 +0000 UTC m=+239.226779761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:50 crc kubenswrapper[4830]: I0311 09:17:50.992385 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57674: no serving certificate available for the kubelet" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.046321 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.046648 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.546635527 +0000 UTC m=+239.327786216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.084554 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57688: no serving certificate available for the kubelet" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.147525 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.147724 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.647695795 +0000 UTC m=+239.428846504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.147789 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.148246 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.64822797 +0000 UTC m=+239.429378659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.209940 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57692: no serving certificate available for the kubelet" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.248974 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.249170 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.749144702 +0000 UTC m=+239.530295391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.249424 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.249748 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.749735659 +0000 UTC m=+239.530886348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.275333 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" podStartSLOduration=169.275315896 podStartE2EDuration="2m49.275315896s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:50.764045689 +0000 UTC m=+238.545196378" watchObservedRunningTime="2026-03-11 09:17:51.275315896 +0000 UTC m=+239.056466575" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.277872 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6dtx"] Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.299384 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9"] Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.351200 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.351399 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.851377753 +0000 UTC m=+239.632528442 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.351622 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.351957 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.851946219 +0000 UTC m=+239.633096908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.453310 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.453450 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.953433419 +0000 UTC m=+239.734584108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.453720 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.454031 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:51.954008395 +0000 UTC m=+239.735159084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.554457 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.554635 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.054609069 +0000 UTC m=+239.835759758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.554820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.555153 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.055145745 +0000 UTC m=+239.836296424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.569242 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" event={"ID":"dec76324-c87e-445a-9236-a674976e81dc","Type":"ContainerStarted","Data":"00e79041cf3ed1678a5cb374084c82fc0dabdf98870d76be2b012b5b1102e535"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.569293 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" event={"ID":"dec76324-c87e-445a-9236-a674976e81dc","Type":"ContainerStarted","Data":"edc98d4307c2876984adbf8dc744a62adf029f77f5973e53337a796585b512c5"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.571055 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" event={"ID":"0da359c8-ec5a-4e8b-9632-7b5aba7856d0","Type":"ContainerStarted","Data":"6f80a4ededa76d52f8e0b3b9f97a99b26be3e32511bb1ed5243345e1b47a0e84"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.572473 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" event={"ID":"3ba20c1f-29f0-4784-8683-621c75daffb3","Type":"ContainerStarted","Data":"9f0d27d17925cfced963b4b15da24e11586979e450ac25adb83ab6ad35baaa83"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.572495 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-k8b27" event={"ID":"3ba20c1f-29f0-4784-8683-621c75daffb3","Type":"ContainerStarted","Data":"165bf2f5e1391030acc390ebfca6192a0ee42017501219b2dbfd222e08bdbf8b"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.577000 4830 patch_prober.go:28] interesting pod/router-default-5444994796-j29w2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 09:17:51 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 11 09:17:51 crc kubenswrapper[4830]: [+]process-running ok Mar 11 09:17:51 crc kubenswrapper[4830]: healthz check failed Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.577065 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j29w2" podUID="dc054b59-4478-4188-a9a5-12bb26b68c96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.578502 4830 generic.go:334] "Generic (PLEG): container finished" podID="efd0238e-2294-4b23-ab03-88e149c4a0c9" containerID="ef91b38f05c9233b18dc904599d163b9c05cf7a5f21254180b50c4a2ffe400b5" exitCode=0 Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.578540 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" event={"ID":"efd0238e-2294-4b23-ab03-88e149c4a0c9","Type":"ContainerDied","Data":"ef91b38f05c9233b18dc904599d163b9c05cf7a5f21254180b50c4a2ffe400b5"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.584814 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nfnzk" event={"ID":"84392623-69c8-4e0c-a3bf-59a57b487494","Type":"ContainerStarted","Data":"08b052e765f999919db158424e20b65bdd57b602c738700b87d332aa1d4464d8"} Mar 11 
09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.586758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" event={"ID":"31e79cbc-4d50-4925-9b5b-7a5012de13a0","Type":"ContainerStarted","Data":"b63377a859b089fec928c11c8025723aa32bd1cd799acee4f9c5925d4774dd0a"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.586783 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" event={"ID":"31e79cbc-4d50-4925-9b5b-7a5012de13a0","Type":"ContainerStarted","Data":"3a95c0b4c52f905ee1aa91a441ec3f6d5e2e6242d7f5dbb13955535ad441caaa"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.589005 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qn6h2" event={"ID":"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab","Type":"ContainerStarted","Data":"b66427fd223d973388089c78678d462d072ff7bc6a67a21550d88aac1be9e44a"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.589092 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qn6h2" event={"ID":"2c8dd96b-cc53-49ce-9f64-ec26e87c62ab","Type":"ContainerStarted","Data":"e22662066bbcc41498257fa4c6bc527e02e0656a736e4709e554cd05c84d7fa4"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.589131 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qn6h2" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.593048 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-sbhg5" podStartSLOduration=168.59303223 podStartE2EDuration="2m48.59303223s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:51.590813134 +0000 UTC m=+239.371963833" watchObservedRunningTime="2026-03-11 
09:17:51.59303223 +0000 UTC m=+239.374182909" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.597849 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" event={"ID":"4704aedc-31a3-4890-a1f9-1fc6533caae0","Type":"ContainerStarted","Data":"fa04d03250e6b71f346f61f955908f42712dd12889e7bb9c6ac0b8f0e2575423"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.603629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" event={"ID":"61f0fae5-d6d9-4f7b-b163-72316a316a37","Type":"ContainerStarted","Data":"52293eed84a221b22d88a5f809eac5ae14616486abaf52deb1a8f0b74f59490b"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.603755 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.610324 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" event={"ID":"e29d573c-753f-4d0d-8c55-3e2842703508","Type":"ContainerStarted","Data":"36b1c6caccee54937e275416c77e0601e0ae1c8c0d8b4f2614de9e770bc2bca0"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.622299 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wbrh6" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.623284 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-426g8" event={"ID":"81a2f08a-36b5-487c-8222-f303e055755f","Type":"ContainerStarted","Data":"8ffbcc79d13ef06bd4871cfe57d16ca844573f93bdba422bb0868354cdac9e8e"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.640548 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-s8cnh" event={"ID":"fd48e28d-bcdd-4bba-a540-0213cda9599a","Type":"ContainerStarted","Data":"5e7697e2ce66113a4f7610abe20f9017caaee8477daa5eaaa71720fadea7fc10"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.655479 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.657508 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.157491059 +0000 UTC m=+239.938641748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.664303 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jr5k9" podStartSLOduration=169.664281586 podStartE2EDuration="2m49.664281586s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:51.628882054 +0000 UTC m=+239.410032763" watchObservedRunningTime="2026-03-11 09:17:51.664281586 +0000 UTC m=+239.445432275" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.670263 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-k476p" event={"ID":"19e5e854-acf9-4b7e-9088-5c8a50ea0186","Type":"ContainerStarted","Data":"4c9e2364f08f94fb0da1aebd2ead9cc9f7bb84ae339a64d62ca4429ffb525765"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.688891 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6hzgd" podStartSLOduration=168.688875184 podStartE2EDuration="2m48.688875184s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:51.688222785 +0000 UTC m=+239.469373474" watchObservedRunningTime="2026-03-11 09:17:51.688875184 +0000 UTC m=+239.470025873" Mar 11 09:17:51 crc kubenswrapper[4830]: 
I0311 09:17:51.697370 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" event={"ID":"80895ee9-5c6f-4e2d-b7c0-88b50d57a720","Type":"ContainerStarted","Data":"7534db9761f57924778c696a9453c2e0ce3e6875d4928bc63f748145a51f9c4d"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.698202 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.700377 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" event={"ID":"bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de","Type":"ContainerStarted","Data":"f57e78e039d967158f72e9179838509f296d214920223df0a9211c69085ff365"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.713634 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hnjgz" event={"ID":"199126b4-f145-46ee-9018-ab173f1267f7","Type":"ContainerStarted","Data":"a2349fe94c69698b1a9f84bacffe2fa789012cd61fc545f29b4ccb1a423ba1fe"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.714462 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.723569 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" event={"ID":"02fc0c48-db54-4b8c-8642-287a4720251f","Type":"ContainerStarted","Data":"5e64555a4b287f3104e54a57b63eeefe9486d16a79723783d4fed066f18cd990"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.723624 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" 
event={"ID":"02fc0c48-db54-4b8c-8642-287a4720251f","Type":"ContainerStarted","Data":"9b9fd6ae6a34a63e98bed7402f54eedc56c1f750d722cfebaebf55b4c5cebf5c"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.724051 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.724152 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.731910 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qn6h2" podStartSLOduration=7.731895748 podStartE2EDuration="7.731895748s" podCreationTimestamp="2026-03-11 09:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:51.7312943 +0000 UTC m=+239.512444989" watchObservedRunningTime="2026-03-11 09:17:51.731895748 +0000 UTC m=+239.513046437" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.756113 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" event={"ID":"0a49527e-c963-4f2c-8e8c-5f2a879ac281","Type":"ContainerStarted","Data":"79679788c2bbee8f67b156603f8ff898cfbda8389ef16e1dbb8f65ed0a0812d4"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.762829 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.765625 4830 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.265607901 +0000 UTC m=+240.046758640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.784309 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" event={"ID":"34b4e37c-7920-4056-b0f9-38606804f021","Type":"ContainerStarted","Data":"05571c3c4f87c6db63b3991dc6ca83ff1316dd3f28e3d7f2a7801a946cc609e1"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.801646 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-9r6tk" event={"ID":"b04656f2-1bad-494e-a8d9-9671fe42431e","Type":"ContainerStarted","Data":"ce7db57af63b300f79c95a6d585d01b97bbf3fb839bcaa83f8418225145e3f45"} Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.806211 4830 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pplbq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.806266 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" 
podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.816665 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.828696 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.844431 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" podStartSLOduration=169.844411029 podStartE2EDuration="2m49.844411029s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:51.81324803 +0000 UTC m=+239.594398719" watchObservedRunningTime="2026-03-11 09:17:51.844411029 +0000 UTC m=+239.625561718" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.863833 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.869278 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 09:17:52.369226302 +0000 UTC m=+240.150376991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.873073 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.879307 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.879321 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.379297306 +0000 UTC m=+240.160447995 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.880517 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.882460 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bdrq8" podStartSLOduration=168.882443487 podStartE2EDuration="2m48.882443487s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:51.843860393 +0000 UTC m=+239.625011092" watchObservedRunningTime="2026-03-11 09:17:51.882443487 +0000 UTC m=+239.663594176" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.899327 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x98hk" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.908497 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hnjgz" podStartSLOduration=169.908442106 podStartE2EDuration="2m49.908442106s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:51.9062125 +0000 UTC m=+239.687363189" 
watchObservedRunningTime="2026-03-11 09:17:51.908442106 +0000 UTC m=+239.689592785" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.946210 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.946515 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.964275 4830 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-ffkfl container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.964382 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" podUID="bb3af6c7-fdcc-4b1a-bedf-ccfcb0e973de" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.973570 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57698: no serving certificate available for the kubelet" Mar 11 09:17:51 crc kubenswrapper[4830]: I0311 09:17:51.984751 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:51 crc kubenswrapper[4830]: E0311 09:17:51.985310 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.485287107 +0000 UTC m=+240.266437796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.088695 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.089009 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.58899811 +0000 UTC m=+240.370148799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.108606 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6j6fc" podStartSLOduration=169.108586082 podStartE2EDuration="2m49.108586082s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:52.008505493 +0000 UTC m=+239.789656172" watchObservedRunningTime="2026-03-11 09:17:52.108586082 +0000 UTC m=+239.889736771" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.109759 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" podStartSLOduration=169.109753666 podStartE2EDuration="2m49.109753666s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:52.108161168 +0000 UTC m=+239.889311857" watchObservedRunningTime="2026-03-11 09:17:52.109753666 +0000 UTC m=+239.890904355" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.189582 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.189775 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.689748478 +0000 UTC m=+240.470899167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.189817 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.190283 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.690271833 +0000 UTC m=+240.471422522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.247140 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" podStartSLOduration=169.247126351 podStartE2EDuration="2m49.247126351s" podCreationTimestamp="2026-03-11 09:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:52.244514785 +0000 UTC m=+240.025665494" watchObservedRunningTime="2026-03-11 09:17:52.247126351 +0000 UTC m=+240.028277040" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.293528 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.293817 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.793803282 +0000 UTC m=+240.574953971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.394843 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.395180 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.895157817 +0000 UTC m=+240.676308506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.450297 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" podStartSLOduration=170.450279794 podStartE2EDuration="2m50.450279794s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:52.419900028 +0000 UTC m=+240.201050727" watchObservedRunningTime="2026-03-11 09:17:52.450279794 +0000 UTC m=+240.231430483" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.495893 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.496097 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.99607078 +0000 UTC m=+240.777221469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.496168 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.496468 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:52.996456451 +0000 UTC m=+240.777607140 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.549604 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hnjgz" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.568092 4830 patch_prober.go:28] interesting pod/router-default-5444994796-j29w2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 09:17:52 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 11 09:17:52 crc kubenswrapper[4830]: [+]process-running ok Mar 11 09:17:52 crc kubenswrapper[4830]: healthz check failed Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.568160 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j29w2" podUID="dc054b59-4478-4188-a9a5-12bb26b68c96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.597455 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.597555 4830 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.097540698 +0000 UTC m=+240.878691387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.597863 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.598137 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.098123274 +0000 UTC m=+240.879273963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.621305 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6st5"] Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.622163 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.626345 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.698680 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.698865 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.198827681 +0000 UTC m=+240.979978370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.699028 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-utilities\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.699115 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.699200 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-catalog-content\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.699257 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmw9f\" (UniqueName: 
\"kubernetes.io/projected/b6155028-4ba3-48be-b83d-7bbe65f28ba7-kube-api-access-qmw9f\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.699459 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.199446539 +0000 UTC m=+240.980597228 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.706824 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6st5"] Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.800486 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.800697 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 09:17:53.300679221 +0000 UTC m=+241.081829910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.800753 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-utilities\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.800807 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.800844 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-catalog-content\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.800872 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmw9f\" (UniqueName: 
\"kubernetes.io/projected/b6155028-4ba3-48be-b83d-7bbe65f28ba7-kube-api-access-qmw9f\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.801178 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-utilities\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.801215 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.301199865 +0000 UTC m=+241.082350554 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.801468 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-catalog-content\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.807521 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" 
event={"ID":"34b4e37c-7920-4056-b0f9-38606804f021","Type":"ContainerStarted","Data":"a5a3a9e31a26e9dd0ca9b9951c254e8c06b7b6760b08fbe9ba091b27b870691e"} Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.810526 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" podUID="49ec9e50-30cd-4028-bf0d-ac67afb7e344" containerName="route-controller-manager" containerID="cri-o://71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953" gracePeriod=30 Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.811687 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" podUID="23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" containerName="controller-manager" containerID="cri-o://a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8" gracePeriod=30 Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.812321 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.834978 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmw9f\" (UniqueName: \"kubernetes.io/projected/b6155028-4ba3-48be-b83d-7bbe65f28ba7-kube-api-access-qmw9f\") pod \"community-operators-m6st5\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.918797 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 
09:17:52.919310 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.419286889 +0000 UTC m=+241.200437598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:52 crc kubenswrapper[4830]: I0311 09:17:52.937546 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:52 crc kubenswrapper[4830]: E0311 09:17:52.945945 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.445910525 +0000 UTC m=+241.227061214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.005237 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.007112 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.019671 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vvk9l"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.020633 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.052346 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvk9l"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.083232 4830 patch_prober.go:28] interesting pod/apiserver-76f77b778f-r4nnq container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]log ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]etcd ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/generic-apiserver-start-informers ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/max-in-flight-filter ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 11 09:17:53 crc kubenswrapper[4830]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 11 09:17:53 crc kubenswrapper[4830]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/project.openshift.io-projectcache ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-startinformers ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 11 09:17:53 crc kubenswrapper[4830]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 11 09:17:53 crc kubenswrapper[4830]: livez check failed Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.083282 4830 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" podUID="0a49527e-c963-4f2c-8e8c-5f2a879ac281" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.100683 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.100988 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-catalog-content\") pod \"community-operators-vvk9l\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.101041 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-utilities\") pod \"community-operators-vvk9l\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.101079 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cnkj\" (UniqueName: \"kubernetes.io/projected/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-kube-api-access-8cnkj\") pod \"community-operators-vvk9l\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.101239 4830 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.601212433 +0000 UTC m=+241.382363122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.194281 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x9zpp"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.195454 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.202105 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cnkj\" (UniqueName: \"kubernetes.io/projected/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-kube-api-access-8cnkj\") pod \"community-operators-vvk9l\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.202170 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.202223 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-catalog-content\") pod \"community-operators-vvk9l\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.202253 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-utilities\") pod \"community-operators-vvk9l\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.202643 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-utilities\") pod \"community-operators-vvk9l\" 
(UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.203132 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.703120935 +0000 UTC m=+241.484271624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.204068 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.207136 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x9zpp"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.209104 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-catalog-content\") pod \"community-operators-vvk9l\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.240208 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cnkj\" (UniqueName: \"kubernetes.io/projected/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-kube-api-access-8cnkj\") pod \"community-operators-vvk9l\" (UID: 
\"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.303959 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.304098 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.804072298 +0000 UTC m=+241.585222977 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.304150 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-utilities\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.304231 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhjg4\" (UniqueName: 
\"kubernetes.io/projected/7febf059-370d-4a68-a543-3b23879ba479-kube-api-access-vhjg4\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.304386 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.304418 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-catalog-content\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.304733 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.804726687 +0000 UTC m=+241.585877376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.338439 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57702: no serving certificate available for the kubelet" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.355426 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.393971 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbnft"] Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.394638 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd0238e-2294-4b23-ab03-88e149c4a0c9" containerName="collect-profiles" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.394651 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd0238e-2294-4b23-ab03-88e149c4a0c9" containerName="collect-profiles" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.395662 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd0238e-2294-4b23-ab03-88e149c4a0c9" containerName="collect-profiles" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.397293 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.412511 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sj5g\" (UniqueName: \"kubernetes.io/projected/efd0238e-2294-4b23-ab03-88e149c4a0c9-kube-api-access-2sj5g\") pod \"efd0238e-2294-4b23-ab03-88e149c4a0c9\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.413980 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.414506 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.414826 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efd0238e-2294-4b23-ab03-88e149c4a0c9-secret-volume\") pod \"efd0238e-2294-4b23-ab03-88e149c4a0c9\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.414906 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efd0238e-2294-4b23-ab03-88e149c4a0c9-config-volume\") pod \"efd0238e-2294-4b23-ab03-88e149c4a0c9\" (UID: \"efd0238e-2294-4b23-ab03-88e149c4a0c9\") " Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.414926 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.91490578 +0000 UTC m=+241.696056469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.415126 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-utilities\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.415168 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhjg4\" (UniqueName: \"kubernetes.io/projected/7febf059-370d-4a68-a543-3b23879ba479-kube-api-access-vhjg4\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.415197 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbnft"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.415242 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.415263 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-catalog-content\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.415812 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:53.915801485 +0000 UTC m=+241.696952174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.416256 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-catalog-content\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.416631 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efd0238e-2294-4b23-ab03-88e149c4a0c9-config-volume" (OuterVolumeSpecName: "config-volume") pod "efd0238e-2294-4b23-ab03-88e149c4a0c9" (UID: 
"efd0238e-2294-4b23-ab03-88e149c4a0c9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.425667 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd0238e-2294-4b23-ab03-88e149c4a0c9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "efd0238e-2294-4b23-ab03-88e149c4a0c9" (UID: "efd0238e-2294-4b23-ab03-88e149c4a0c9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.425924 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-utilities\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.436828 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd0238e-2294-4b23-ab03-88e149c4a0c9-kube-api-access-2sj5g" (OuterVolumeSpecName: "kube-api-access-2sj5g") pod "efd0238e-2294-4b23-ab03-88e149c4a0c9" (UID: "efd0238e-2294-4b23-ab03-88e149c4a0c9"). InnerVolumeSpecName "kube-api-access-2sj5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.445906 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhjg4\" (UniqueName: \"kubernetes.io/projected/7febf059-370d-4a68-a543-3b23879ba479-kube-api-access-vhjg4\") pod \"certified-operators-x9zpp\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.518642 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.518834 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.018809849 +0000 UTC m=+241.799960538 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.518878 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-utilities\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.519241 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-catalog-content\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.519388 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.519453 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhwgx\" (UniqueName: 
\"kubernetes.io/projected/ebefef77-9e3b-45d5-8301-53df1b75c9bb-kube-api-access-zhwgx\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.519507 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sj5g\" (UniqueName: \"kubernetes.io/projected/efd0238e-2294-4b23-ab03-88e149c4a0c9-kube-api-access-2sj5g\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.519524 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/efd0238e-2294-4b23-ab03-88e149c4a0c9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.519534 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efd0238e-2294-4b23-ab03-88e149c4a0c9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.519714 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.019695835 +0000 UTC m=+241.800846524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.521825 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.551908 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.567117 4830 patch_prober.go:28] interesting pod/router-default-5444994796-j29w2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 09:17:53 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 11 09:17:53 crc kubenswrapper[4830]: [+]process-running ok Mar 11 09:17:53 crc kubenswrapper[4830]: healthz check failed Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.567167 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j29w2" podUID="dc054b59-4478-4188-a9a5-12bb26b68c96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.599038 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620091 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-serving-cert\") pod \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620214 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620253 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-proxy-ca-bundles\") pod \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620309 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjprf\" (UniqueName: \"kubernetes.io/projected/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-kube-api-access-qjprf\") pod \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620341 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-config\") pod \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620361 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-client-ca\") pod \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\" (UID: \"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620480 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-utilities\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620503 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-catalog-content\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.620610 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhwgx\" (UniqueName: \"kubernetes.io/projected/ebefef77-9e3b-45d5-8301-53df1b75c9bb-kube-api-access-zhwgx\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.621191 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" (UID: "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.622265 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-client-ca" (OuterVolumeSpecName: "client-ca") pod "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" (UID: "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.622842 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-utilities\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.622872 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-config" (OuterVolumeSpecName: "config") pod "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" (UID: "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.622953 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-catalog-content\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.623078 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 09:17:54.123041938 +0000 UTC m=+241.904192627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.623296 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6st5"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.626091 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" (UID: "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.627895 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-kube-api-access-qjprf" (OuterVolumeSpecName: "kube-api-access-qjprf") pod "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" (UID: "23fd7ddf-f544-4e6c-bcb6-1f1805c6570b"). InnerVolumeSpecName "kube-api-access-qjprf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: W0311 09:17:53.636392 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6155028_4ba3_48be_b83d_7bbe65f28ba7.slice/crio-2b591f42d4c33f8d90a60ff1d77e715510852fd9db49ed1d2251b92629b1a778 WatchSource:0}: Error finding container 2b591f42d4c33f8d90a60ff1d77e715510852fd9db49ed1d2251b92629b1a778: Status 404 returned error can't find the container with id 2b591f42d4c33f8d90a60ff1d77e715510852fd9db49ed1d2251b92629b1a778 Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.641948 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhwgx\" (UniqueName: \"kubernetes.io/projected/ebefef77-9e3b-45d5-8301-53df1b75c9bb-kube-api-access-zhwgx\") pod \"certified-operators-wbnft\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.721666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-client-ca\") pod \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722330 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49ec9e50-30cd-4028-bf0d-ac67afb7e344-serving-cert\") pod \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722395 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-config\") pod \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\" (UID: 
\"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722419 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zhcc\" (UniqueName: \"kubernetes.io/projected/49ec9e50-30cd-4028-bf0d-ac67afb7e344-kube-api-access-5zhcc\") pod \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\" (UID: \"49ec9e50-30cd-4028-bf0d-ac67afb7e344\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722580 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722696 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722712 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjprf\" (UniqueName: \"kubernetes.io/projected/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-kube-api-access-qjprf\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722723 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722732 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.722740 4830 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.722966 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.222955611 +0000 UTC m=+242.004106290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.723912 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-config" (OuterVolumeSpecName: "config") pod "49ec9e50-30cd-4028-bf0d-ac67afb7e344" (UID: "49ec9e50-30cd-4028-bf0d-ac67afb7e344"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.724487 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-client-ca" (OuterVolumeSpecName: "client-ca") pod "49ec9e50-30cd-4028-bf0d-ac67afb7e344" (UID: "49ec9e50-30cd-4028-bf0d-ac67afb7e344"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.728683 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ec9e50-30cd-4028-bf0d-ac67afb7e344-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "49ec9e50-30cd-4028-bf0d-ac67afb7e344" (UID: "49ec9e50-30cd-4028-bf0d-ac67afb7e344"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.728925 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ec9e50-30cd-4028-bf0d-ac67afb7e344-kube-api-access-5zhcc" (OuterVolumeSpecName: "kube-api-access-5zhcc") pod "49ec9e50-30cd-4028-bf0d-ac67afb7e344" (UID: "49ec9e50-30cd-4028-bf0d-ac67afb7e344"). InnerVolumeSpecName "kube-api-access-5zhcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.778491 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.823666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.823973 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49ec9e50-30cd-4028-bf0d-ac67afb7e344-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.823985 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.823995 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zhcc\" (UniqueName: \"kubernetes.io/projected/49ec9e50-30cd-4028-bf0d-ac67afb7e344-kube-api-access-5zhcc\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.824003 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49ec9e50-30cd-4028-bf0d-ac67afb7e344-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.825661 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.325642785 +0000 UTC m=+242.106793474 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.846558 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" event={"ID":"34b4e37c-7920-4056-b0f9-38606804f021","Type":"ContainerStarted","Data":"d367e0d5fd9e0a8ab76ab85387ae555f688997756d6861675f0698d2314d421e"} Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.852939 4830 generic.go:334] "Generic (PLEG): container finished" podID="23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" containerID="a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8" exitCode=0 Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.853001 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" event={"ID":"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b","Type":"ContainerDied","Data":"a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8"} Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.853028 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.853050 4830 scope.go:117] "RemoveContainer" containerID="a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.853038 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-l6dtx" event={"ID":"23fd7ddf-f544-4e6c-bcb6-1f1805c6570b","Type":"ContainerDied","Data":"c7c67ffc0c8ddee7af0e860db00032ead86c1a9d55218cc921b80a1bc504ccf7"} Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.856055 4830 generic.go:334] "Generic (PLEG): container finished" podID="49ec9e50-30cd-4028-bf0d-ac67afb7e344" containerID="71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953" exitCode=0 Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.856142 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" event={"ID":"49ec9e50-30cd-4028-bf0d-ac67afb7e344","Type":"ContainerDied","Data":"71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953"} Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.856260 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" event={"ID":"49ec9e50-30cd-4028-bf0d-ac67afb7e344","Type":"ContainerDied","Data":"d7f71751d8b7f3b6a96be88417311b8d2813f6ba170427c81386e29fb762e615"} Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.856157 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.873385 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6st5" event={"ID":"b6155028-4ba3-48be-b83d-7bbe65f28ba7","Type":"ContainerStarted","Data":"2b591f42d4c33f8d90a60ff1d77e715510852fd9db49ed1d2251b92629b1a778"} Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.884467 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" event={"ID":"efd0238e-2294-4b23-ab03-88e149c4a0c9","Type":"ContainerDied","Data":"170f3051fe18aa334f41a168cc283c9b22fc9e3225fdc1200b30194c1bebc7f9"} Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.884526 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="170f3051fe18aa334f41a168cc283c9b22fc9e3225fdc1200b30194c1bebc7f9" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.885083 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.894990 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvk9l"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.935512 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.936604 4830 scope.go:117] "RemoveContainer" containerID="a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8" Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.937032 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.437002623 +0000 UTC m=+242.218153312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:53 crc kubenswrapper[4830]: E0311 09:17:53.937548 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8\": container with ID starting with a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8 not found: ID does not exist" containerID="a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.937642 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8"} err="failed to get container status \"a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8\": rpc error: code = NotFound desc = could not find container \"a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8\": container with ID starting with a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8 not found: ID does not exist" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.937670 4830 scope.go:117] "RemoveContainer" containerID="71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953" Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.960748 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-l6dtx"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.965370 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-l6dtx"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.980490 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9"] Mar 11 09:17:53 crc kubenswrapper[4830]: I0311 09:17:53.983794 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-xm7n9"] Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.008896 4830 scope.go:117] "RemoveContainer" containerID="71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.009342 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953\": container with ID starting with 71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953 not found: ID does not exist" containerID="71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.009410 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953"} err="failed to get container status \"71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953\": rpc error: code = NotFound desc = could not find container \"71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953\": container with ID starting with 71516b0bc29dfc8a3b14f5ba09c2da120340f89f598e54beae3941511149a953 not found: ID does not exist" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.038373 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.038620 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.538584494 +0000 UTC m=+242.319735193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.038857 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.039560 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.539522822 +0000 UTC m=+242.320673511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.058327 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbnft"] Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.139861 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.140087 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.640059493 +0000 UTC m=+242.421210182 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.140222 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.140594 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.640580338 +0000 UTC m=+242.421731027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.164660 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x9zpp"] Mar 11 09:17:54 crc kubenswrapper[4830]: W0311 09:17:54.184417 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7febf059_370d_4a68_a543_3b23879ba479.slice/crio-3a0863179e12d91cb364b7003106964135055c075bb0a2a4790b50b498f254f3 WatchSource:0}: Error finding container 3a0863179e12d91cb364b7003106964135055c075bb0a2a4790b50b498f254f3: Status 404 returned error can't find the container with id 3a0863179e12d91cb364b7003106964135055c075bb0a2a4790b50b498f254f3 Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.185366 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-conmon-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c6ba43_1cb8_442f_a5f2_b7bac6ed1c66.slice/crio-conmon-aa812bf6f24f9b54d10d6274e6fec320b59f6ea957d66252224ff0cf8039d3d1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c6ba43_1cb8_442f_a5f2_b7bac6ed1c66.slice/crio-aa812bf6f24f9b54d10d6274e6fec320b59f6ea957d66252224ff0cf8039d3d1.scope\": RecentStats: unable to find data in memory cache]" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.241783 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.242009 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.741983484 +0000 UTC m=+242.523134173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.242096 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.242588 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.742579632 +0000 UTC m=+242.523730321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.344865 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.345209 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.845092321 +0000 UTC m=+242.626243010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.345735 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.346063 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.846047219 +0000 UTC m=+242.627197908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.447062 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.447464 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:54.947443565 +0000 UTC m=+242.728594254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.548487 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.548865 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:55.048854362 +0000 UTC m=+242.830005052 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.565926 4830 patch_prober.go:28] interesting pod/router-default-5444994796-j29w2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 09:17:54 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 11 09:17:54 crc kubenswrapper[4830]: [+]process-running ok Mar 11 09:17:54 crc kubenswrapper[4830]: healthz check failed Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.565984 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j29w2" podUID="dc054b59-4478-4188-a9a5-12bb26b68c96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.649466 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.649959 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 09:17:55.14994075 +0000 UTC m=+242.931091439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.751658 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.752082 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:55.252069457 +0000 UTC m=+243.033220146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.763771 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.763984 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" containerName="controller-manager" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.763996 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" containerName="controller-manager" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.764022 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ec9e50-30cd-4028-bf0d-ac67afb7e344" containerName="route-controller-manager" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.764030 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ec9e50-30cd-4028-bf0d-ac67afb7e344" containerName="route-controller-manager" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.764127 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ec9e50-30cd-4028-bf0d-ac67afb7e344" containerName="route-controller-manager" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.764144 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" containerName="controller-manager" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.765189 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.766944 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.769265 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.773729 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.852514 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.852727 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-11 09:17:55.352701072 +0000 UTC m=+243.133851771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.852802 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.852984 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.853075 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.853324 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-11 09:17:55.353311679 +0000 UTC m=+243.134462378 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.891886 4830 generic.go:334] "Generic (PLEG): container finished" podID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerID="4e601c1281cd1b3adae8a51281dd286c73a2a6d816fc5c6cdbe081c13e8aab8a" exitCode=0 Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.891955 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6st5" event={"ID":"b6155028-4ba3-48be-b83d-7bbe65f28ba7","Type":"ContainerDied","Data":"4e601c1281cd1b3adae8a51281dd286c73a2a6d816fc5c6cdbe081c13e8aab8a"} Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.900262 4830 generic.go:334] "Generic (PLEG): container finished" podID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerID="aa812bf6f24f9b54d10d6274e6fec320b59f6ea957d66252224ff0cf8039d3d1" exitCode=0 Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.900338 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvk9l" event={"ID":"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66","Type":"ContainerDied","Data":"aa812bf6f24f9b54d10d6274e6fec320b59f6ea957d66252224ff0cf8039d3d1"} Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.900370 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvk9l" 
event={"ID":"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66","Type":"ContainerStarted","Data":"6f175126e09e5cf12dbcfd97ea672787cb3bfbe875baab793fd8b58265df3c80"} Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.901804 4830 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.907964 4830 generic.go:334] "Generic (PLEG): container finished" podID="7febf059-370d-4a68-a543-3b23879ba479" containerID="ba13d6dd3b18dc83f39343ace3841bc0c5cfd9957c62021f9e678ccae4cea5bb" exitCode=0 Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.908104 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9zpp" event={"ID":"7febf059-370d-4a68-a543-3b23879ba479","Type":"ContainerDied","Data":"ba13d6dd3b18dc83f39343ace3841bc0c5cfd9957c62021f9e678ccae4cea5bb"} Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.908131 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9zpp" event={"ID":"7febf059-370d-4a68-a543-3b23879ba479","Type":"ContainerStarted","Data":"3a0863179e12d91cb364b7003106964135055c075bb0a2a4790b50b498f254f3"} Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.918201 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.919830 4830 generic.go:334] "Generic (PLEG): container finished" podID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerID="a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7" exitCode=0 Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.920142 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnft" 
event={"ID":"ebefef77-9e3b-45d5-8301-53df1b75c9bb","Type":"ContainerDied","Data":"a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7"} Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.920173 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnft" event={"ID":"ebefef77-9e3b-45d5-8301-53df1b75c9bb","Type":"ContainerStarted","Data":"7f51526411ec0eb6ddcfa211e449953346c2a629a9b3d0a658a6a579cae72f1c"} Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.920248 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.921977 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.922107 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.929184 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" event={"ID":"34b4e37c-7920-4056-b0f9-38606804f021","Type":"ContainerStarted","Data":"8d43fe9a7fe7eebda5407818ac7e5f98b51fc687ed5304d4d8cf12b7fdd62ba8"} Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.930759 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.944040 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fd7ddf-f544-4e6c-bcb6-1f1805c6570b" path="/var/lib/kubelet/pods/23fd7ddf-f544-4e6c-bcb6-1f1805c6570b/volumes" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.957227 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ec9e50-30cd-4028-bf0d-ac67afb7e344" 
path="/var/lib/kubelet/pods/49ec9e50-30cd-4028-bf0d-ac67afb7e344/volumes" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.969962 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.970376 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.970496 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.970573 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 09:17:54 crc kubenswrapper[4830]: E0311 09:17:54.970658 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-11 09:17:55.47064404 +0000 UTC m=+243.251794729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:54 crc kubenswrapper[4830]: I0311 09:17:54.996770 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qgv"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.011948 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.013994 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rj9w8" podStartSLOduration=11.013978194 podStartE2EDuration="11.013978194s" podCreationTimestamp="2026-03-11 09:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:55.011630565 +0000 UTC m=+242.792781254" watchObservedRunningTime="2026-03-11 09:17:55.013978194 +0000 UTC m=+242.795128883" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.019831 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qgv"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.020081 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.021435 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.072164 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.072267 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.072329 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: E0311 09:17:55.073137 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-11 09:17:55.573121198 +0000 UTC m=+243.354271887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8dpw5" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.083706 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.164463 4830 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-11T09:17:54.901835924Z","Handler":null,"Name":""} Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.174648 4830 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.174692 4830 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.175479 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.175636 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-utilities\") pod \"redhat-marketplace-d5qgv\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.175678 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-catalog-content\") pod \"redhat-marketplace-d5qgv\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.175707 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.175790 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.175859 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dv6z\" (UniqueName: \"kubernetes.io/projected/44447137-e7c5-4a07-bcdd-6bcc4c835f79-kube-api-access-7dv6z\") pod \"redhat-marketplace-d5qgv\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.175887 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.178802 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.210576 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.238309 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.276891 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dv6z\" (UniqueName: \"kubernetes.io/projected/44447137-e7c5-4a07-bcdd-6bcc4c835f79-kube-api-access-7dv6z\") pod \"redhat-marketplace-d5qgv\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.276991 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.277108 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-utilities\") pod \"redhat-marketplace-d5qgv\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.277149 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-catalog-content\") pod \"redhat-marketplace-d5qgv\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.277975 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-catalog-content\") pod \"redhat-marketplace-d5qgv\" 
(UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.278478 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-utilities\") pod \"redhat-marketplace-d5qgv\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.290169 4830 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.290227 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.294925 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dv6z\" (UniqueName: \"kubernetes.io/projected/44447137-e7c5-4a07-bcdd-6bcc4c835f79-kube-api-access-7dv6z\") pod \"redhat-marketplace-d5qgv\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.312863 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.318940 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8dpw5\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.332537 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.422305 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ngwt4"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.423408 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.425979 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngwt4"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.451560 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 11 09:17:55 crc kubenswrapper[4830]: W0311 09:17:55.462296 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb7be9c85_ce7e_44c6_af8e_9a1cb139036a.slice/crio-54a44c38d7e22206c8600be86c5f6087233561af542cef056cdb1693609fd79f WatchSource:0}: Error finding container 54a44c38d7e22206c8600be86c5f6087233561af542cef056cdb1693609fd79f: Status 404 returned error can't find the container with id 54a44c38d7e22206c8600be86c5f6087233561af542cef056cdb1693609fd79f Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.483914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-catalog-content\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.483969 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-utilities\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.483998 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr87t\" (UniqueName: \"kubernetes.io/projected/63e229ca-0853-43c9-8ad6-a5e236df0812-kube-api-access-tr87t\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.543601 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qgv"] Mar 11 09:17:55 crc kubenswrapper[4830]: W0311 09:17:55.548887 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44447137_e7c5_4a07_bcdd_6bcc4c835f79.slice/crio-610c149787db87222d534cd474c89236f4a7fbb80e9d21c6f6a0f37a5d178f9a WatchSource:0}: Error finding container 610c149787db87222d534cd474c89236f4a7fbb80e9d21c6f6a0f37a5d178f9a: Status 404 returned error can't find the container with id 610c149787db87222d534cd474c89236f4a7fbb80e9d21c6f6a0f37a5d178f9a Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.566724 4830 patch_prober.go:28] interesting pod/router-default-5444994796-j29w2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 09:17:55 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 11 09:17:55 crc kubenswrapper[4830]: [+]process-running ok Mar 11 09:17:55 crc kubenswrapper[4830]: healthz check failed Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.566786 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j29w2" podUID="dc054b59-4478-4188-a9a5-12bb26b68c96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.584849 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-catalog-content\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.585634 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-utilities\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.585980 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr87t\" (UniqueName: \"kubernetes.io/projected/63e229ca-0853-43c9-8ad6-a5e236df0812-kube-api-access-tr87t\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.585575 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-catalog-content\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.585928 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-utilities\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.606848 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr87t\" (UniqueName: \"kubernetes.io/projected/63e229ca-0853-43c9-8ad6-a5e236df0812-kube-api-access-tr87t\") pod \"redhat-marketplace-ngwt4\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.618723 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.627440 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.674936 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.676661 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.680775 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.681209 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.681370 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.681948 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.683272 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.683538 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.683660 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78bc98697b-bcww7"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.699480 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.699539 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78bc98697b-bcww7"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.699615 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.704393 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.704518 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.704799 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.707250 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.707647 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.709730 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.736703 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.753406 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806161 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-config\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806217 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7sdh\" (UniqueName: \"kubernetes.io/projected/145c64de-fbd3-4d66-b20a-35312d942033-kube-api-access-j7sdh\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806242 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvd6n\" (UniqueName: \"kubernetes.io/projected/607fd908-111b-4a04-a6d7-09d548896c30-kube-api-access-kvd6n\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806295 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-client-ca\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806329 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-client-ca\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806350 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145c64de-fbd3-4d66-b20a-35312d942033-serving-cert\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806461 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607fd908-111b-4a04-a6d7-09d548896c30-serving-cert\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806495 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-config\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.806639 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-proxy-ca-bundles\") pod \"controller-manager-78bc98697b-bcww7\" (UID: 
\"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.909870 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-client-ca\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.910317 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145c64de-fbd3-4d66-b20a-35312d942033-serving-cert\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.910347 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607fd908-111b-4a04-a6d7-09d548896c30-serving-cert\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.911371 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-client-ca\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.911542 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-config\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.911649 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-proxy-ca-bundles\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.911680 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-config\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.911718 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7sdh\" (UniqueName: \"kubernetes.io/projected/145c64de-fbd3-4d66-b20a-35312d942033-kube-api-access-j7sdh\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.911749 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvd6n\" (UniqueName: \"kubernetes.io/projected/607fd908-111b-4a04-a6d7-09d548896c30-kube-api-access-kvd6n\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc 
kubenswrapper[4830]: I0311 09:17:55.911796 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-client-ca\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.912963 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-proxy-ca-bundles\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.914951 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-config\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.915115 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-config\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.916603 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607fd908-111b-4a04-a6d7-09d548896c30-serving-cert\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " 
pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.917400 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-client-ca\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.926467 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145c64de-fbd3-4d66-b20a-35312d942033-serving-cert\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.930530 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvd6n\" (UniqueName: \"kubernetes.io/projected/607fd908-111b-4a04-a6d7-09d548896c30-kube-api-access-kvd6n\") pod \"route-controller-manager-556c6c58b4-h72t6\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") " pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.931622 4830 ???:1] "http: TLS handshake error from 192.168.126.11:57716: no serving certificate available for the kubelet" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.934799 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7sdh\" (UniqueName: \"kubernetes.io/projected/145c64de-fbd3-4d66-b20a-35312d942033-kube-api-access-j7sdh\") pod \"controller-manager-78bc98697b-bcww7\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") " pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:55 crc 
kubenswrapper[4830]: I0311 09:17:55.970469 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6","Type":"ContainerStarted","Data":"60464a298083e0fbfcdd7b9df9c1c58a12a742e31600844761d29e1ac5270cfb"} Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.970505 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6","Type":"ContainerStarted","Data":"bb01090205845624bef0fee828dc54f28d01fc33620856331b24cf7a1fd51e70"} Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.973300 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b7be9c85-ce7e-44c6-af8e-9a1cb139036a","Type":"ContainerStarted","Data":"e195af7b47878cf7f0c01d91e07e0d171b84f76df9cd344fc443a8304ca32320"} Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.973346 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b7be9c85-ce7e-44c6-af8e-9a1cb139036a","Type":"ContainerStarted","Data":"54a44c38d7e22206c8600be86c5f6087233561af542cef056cdb1693609fd79f"} Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.990573 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.990555858 podStartE2EDuration="1.990555858s" podCreationTimestamp="2026-03-11 09:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:55.987434417 +0000 UTC m=+243.768585116" watchObservedRunningTime="2026-03-11 09:17:55.990555858 +0000 UTC m=+243.771706547" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.992356 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xqggt" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.994771 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ctcw6"] Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.997555 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:55 crc kubenswrapper[4830]: I0311 09:17:55.999156 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.012915 4830 generic.go:334] "Generic (PLEG): container finished" podID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerID="17e4c1afc289200a7b31eaa35854ad7b113c5842a8bc517cc8073b90b972c75f" exitCode=0 Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.013844 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qgv" event={"ID":"44447137-e7c5-4a07-bcdd-6bcc4c835f79","Type":"ContainerDied","Data":"17e4c1afc289200a7b31eaa35854ad7b113c5842a8bc517cc8073b90b972c75f"} Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.013866 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qgv" event={"ID":"44447137-e7c5-4a07-bcdd-6bcc4c835f79","Type":"ContainerStarted","Data":"610c149787db87222d534cd474c89236f4a7fbb80e9d21c6f6a0f37a5d178f9a"} Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.013881 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ctcw6"] Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.016219 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.016197936 podStartE2EDuration="2.016197936s" podCreationTimestamp="2026-03-11 
09:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:56.002223039 +0000 UTC m=+243.783373738" watchObservedRunningTime="2026-03-11 09:17:56.016197936 +0000 UTC m=+243.797348625" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.032896 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.040711 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngwt4"] Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.040865 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.055701 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dpw5"] Mar 11 09:17:56 crc kubenswrapper[4830]: W0311 09:17:56.059252 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e229ca_0853_43c9_8ad6_a5e236df0812.slice/crio-8f469d0ac98afa17f7a8a49000ff6c2d89f9ca476f952c1caa69989804c3daf3 WatchSource:0}: Error finding container 8f469d0ac98afa17f7a8a49000ff6c2d89f9ca476f952c1caa69989804c3daf3: Status 404 returned error can't find the container with id 8f469d0ac98afa17f7a8a49000ff6c2d89f9ca476f952c1caa69989804c3daf3 Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.114176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-utilities\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " 
pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.114304 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8wsg\" (UniqueName: \"kubernetes.io/projected/2438f79c-45d2-4b4f-951b-630d3fb2c740-kube-api-access-s8wsg\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.114955 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-catalog-content\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: W0311 09:17:56.142428 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a7090b_02e3_4cf4_a380_4afcd01ceb1f.slice/crio-692cd07551535b6941e04158ff6a512d797a544d425a32548a370154fdf2ae46 WatchSource:0}: Error finding container 692cd07551535b6941e04158ff6a512d797a544d425a32548a370154fdf2ae46: Status 404 returned error can't find the container with id 692cd07551535b6941e04158ff6a512d797a544d425a32548a370154fdf2ae46 Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.215970 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8wsg\" (UniqueName: \"kubernetes.io/projected/2438f79c-45d2-4b4f-951b-630d3fb2c740-kube-api-access-s8wsg\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.216077 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-catalog-content\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.216134 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-utilities\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.216510 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-utilities\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.216710 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-catalog-content\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.234526 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8wsg\" (UniqueName: \"kubernetes.io/projected/2438f79c-45d2-4b4f-951b-630d3fb2c740-kube-api-access-s8wsg\") pod \"redhat-operators-ctcw6\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.295105 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6"] Mar 11 09:17:56 crc kubenswrapper[4830]: 
I0311 09:17:56.321327 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.351345 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78bc98697b-bcww7"] Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.390328 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbnm8"] Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.391658 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.408728 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbnm8"] Mar 11 09:17:56 crc kubenswrapper[4830]: W0311 09:17:56.426377 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607fd908_111b_4a04_a6d7_09d548896c30.slice/crio-a4c062f24a171d31b25a4f6df1cd7fc379cc204f2b9ef37dd225ab94b8a6ca51 WatchSource:0}: Error finding container a4c062f24a171d31b25a4f6df1cd7fc379cc204f2b9ef37dd225ab94b8a6ca51: Status 404 returned error can't find the container with id a4c062f24a171d31b25a4f6df1cd7fc379cc204f2b9ef37dd225ab94b8a6ca51 Mar 11 09:17:56 crc kubenswrapper[4830]: W0311 09:17:56.427786 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod145c64de_fbd3_4d66_b20a_35312d942033.slice/crio-90eaa76287947098a8a5b4a6b6202b6b8e60df3f03e41973ed514b13189adeaf WatchSource:0}: Error finding container 90eaa76287947098a8a5b4a6b6202b6b8e60df3f03e41973ed514b13189adeaf: Status 404 returned error can't find the container with id 90eaa76287947098a8a5b4a6b6202b6b8e60df3f03e41973ed514b13189adeaf Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 
09:17:56.519416 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-catalog-content\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.519546 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clw26\" (UniqueName: \"kubernetes.io/projected/0110c11f-22d8-4a16-b11b-08c46b6a3bed-kube-api-access-clw26\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.519571 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-utilities\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.569451 4830 patch_prober.go:28] interesting pod/router-default-5444994796-j29w2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 09:17:56 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 11 09:17:56 crc kubenswrapper[4830]: [+]process-running ok Mar 11 09:17:56 crc kubenswrapper[4830]: healthz check failed Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.569505 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j29w2" podUID="dc054b59-4478-4188-a9a5-12bb26b68c96" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.622145 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-utilities\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.622205 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-catalog-content\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.622349 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clw26\" (UniqueName: \"kubernetes.io/projected/0110c11f-22d8-4a16-b11b-08c46b6a3bed-kube-api-access-clw26\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.623236 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-utilities\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.623493 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-catalog-content\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 
09:17:56.645380 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clw26\" (UniqueName: \"kubernetes.io/projected/0110c11f-22d8-4a16-b11b-08c46b6a3bed-kube-api-access-clw26\") pod \"redhat-operators-fbnm8\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.743660 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.750437 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-cprgf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.750482 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cprgf" podUID="3569f4d9-01f3-47be-bc27-af6a7cb999a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.751279 4830 patch_prober.go:28] interesting pod/downloads-7954f5f757-cprgf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.751302 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cprgf" podUID="3569f4d9-01f3-47be-bc27-af6a7cb999a1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.785703 
4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.785752 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.790377 4830 patch_prober.go:28] interesting pod/console-f9d7485db-75j46 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.790474 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-75j46" podUID="cd450036-5201-4553-a9de-c08a7a9c9f52" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.892219 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ctcw6"] Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.894358 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.906414 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-r4nnq" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.965718 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 11 09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.977247 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 
09:17:56 crc kubenswrapper[4830]: I0311 09:17:56.991156 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ffkfl" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.032010 4830 generic.go:334] "Generic (PLEG): container finished" podID="dd5c7d03-0ae0-4fa4-8265-ab0645b959d6" containerID="60464a298083e0fbfcdd7b9df9c1c58a12a742e31600844761d29e1ac5270cfb" exitCode=0 Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.032324 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6","Type":"ContainerDied","Data":"60464a298083e0fbfcdd7b9df9c1c58a12a742e31600844761d29e1ac5270cfb"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.045288 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" event={"ID":"145c64de-fbd3-4d66-b20a-35312d942033","Type":"ContainerStarted","Data":"45c9c055423bc6c1f4f0a01df0738270f3269ca5f78e31e3387aab6ebdaf49e0"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.045331 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" event={"ID":"145c64de-fbd3-4d66-b20a-35312d942033","Type":"ContainerStarted","Data":"90eaa76287947098a8a5b4a6b6202b6b8e60df3f03e41973ed514b13189adeaf"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.046183 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.058698 4830 generic.go:334] "Generic (PLEG): container finished" podID="b7be9c85-ce7e-44c6-af8e-9a1cb139036a" containerID="e195af7b47878cf7f0c01d91e07e0d171b84f76df9cd344fc443a8304ca32320" exitCode=0 Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.058857 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b7be9c85-ce7e-44c6-af8e-9a1cb139036a","Type":"ContainerDied","Data":"e195af7b47878cf7f0c01d91e07e0d171b84f76df9cd344fc443a8304ca32320"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.063141 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.070736 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" event={"ID":"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f","Type":"ContainerStarted","Data":"f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.070782 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" event={"ID":"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f","Type":"ContainerStarted","Data":"692cd07551535b6941e04158ff6a512d797a544d425a32548a370154fdf2ae46"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.071573 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.096904 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.097007 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" event={"ID":"607fd908-111b-4a04-a6d7-09d548896c30","Type":"ContainerStarted","Data":"e08a04673deba6b68691b142a71620ab1204d4c41ff0c6c0a1a471e98fc823c7"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.097125 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" event={"ID":"607fd908-111b-4a04-a6d7-09d548896c30","Type":"ContainerStarted","Data":"a4c062f24a171d31b25a4f6df1cd7fc379cc204f2b9ef37dd225ab94b8a6ca51"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.092644 4830 generic.go:334] "Generic (PLEG): container finished" podID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerID="1491318bbb0544d0e7220781dc57b920da53ca22b89cd99636c7e93212f442b0" exitCode=0 Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.097202 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngwt4" event={"ID":"63e229ca-0853-43c9-8ad6-a5e236df0812","Type":"ContainerDied","Data":"1491318bbb0544d0e7220781dc57b920da53ca22b89cd99636c7e93212f442b0"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.097330 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngwt4" event={"ID":"63e229ca-0853-43c9-8ad6-a5e236df0812","Type":"ContainerStarted","Data":"8f469d0ac98afa17f7a8a49000ff6c2d89f9ca476f952c1caa69989804c3daf3"} Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.104960 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.128516 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" podStartSLOduration=6.128493637 podStartE2EDuration="6.128493637s" podCreationTimestamp="2026-03-11 09:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:57.119801524 +0000 UTC m=+244.900952213" watchObservedRunningTime="2026-03-11 09:17:57.128493637 +0000 UTC m=+244.909644326" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 
09:17:57.210566 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" podStartSLOduration=175.21054897 podStartE2EDuration="2m55.21054897s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:57.167216377 +0000 UTC m=+244.948367096" watchObservedRunningTime="2026-03-11 09:17:57.21054897 +0000 UTC m=+244.991699659" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.302749 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" podStartSLOduration=5.302727318 podStartE2EDuration="5.302727318s" podCreationTimestamp="2026-03-11 09:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:17:57.302296195 +0000 UTC m=+245.083446894" watchObservedRunningTime="2026-03-11 09:17:57.302727318 +0000 UTC m=+245.083878007" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.562555 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.569378 4830 patch_prober.go:28] interesting pod/router-default-5444994796-j29w2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 11 09:17:57 crc kubenswrapper[4830]: [-]has-synced failed: reason withheld Mar 11 09:17:57 crc kubenswrapper[4830]: [+]process-running ok Mar 11 09:17:57 crc kubenswrapper[4830]: healthz check failed Mar 11 09:17:57 crc kubenswrapper[4830]: I0311 09:17:57.569449 4830 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-j29w2" podUID="dc054b59-4478-4188-a9a5-12bb26b68c96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:17:58 crc kubenswrapper[4830]: I0311 09:17:58.567866 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:17:58 crc kubenswrapper[4830]: I0311 09:17:58.577176 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-j29w2" Mar 11 09:18:00 crc kubenswrapper[4830]: I0311 09:18:00.139610 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553678-nslvc"] Mar 11 09:18:00 crc kubenswrapper[4830]: I0311 09:18:00.140674 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553678-nslvc" Mar 11 09:18:00 crc kubenswrapper[4830]: I0311 09:18:00.144178 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:18:00 crc kubenswrapper[4830]: I0311 09:18:00.148539 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553678-nslvc"] Mar 11 09:18:00 crc kubenswrapper[4830]: I0311 09:18:00.210040 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5mwq\" (UniqueName: \"kubernetes.io/projected/e577451d-6016-4afc-913a-6d022a9a2f79-kube-api-access-z5mwq\") pod \"auto-csr-approver-29553678-nslvc\" (UID: \"e577451d-6016-4afc-913a-6d022a9a2f79\") " pod="openshift-infra/auto-csr-approver-29553678-nslvc" Mar 11 09:18:00 crc kubenswrapper[4830]: I0311 09:18:00.311060 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5mwq\" (UniqueName: 
\"kubernetes.io/projected/e577451d-6016-4afc-913a-6d022a9a2f79-kube-api-access-z5mwq\") pod \"auto-csr-approver-29553678-nslvc\" (UID: \"e577451d-6016-4afc-913a-6d022a9a2f79\") " pod="openshift-infra/auto-csr-approver-29553678-nslvc" Mar 11 09:18:00 crc kubenswrapper[4830]: I0311 09:18:00.333950 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5mwq\" (UniqueName: \"kubernetes.io/projected/e577451d-6016-4afc-913a-6d022a9a2f79-kube-api-access-z5mwq\") pod \"auto-csr-approver-29553678-nslvc\" (UID: \"e577451d-6016-4afc-913a-6d022a9a2f79\") " pod="openshift-infra/auto-csr-approver-29553678-nslvc" Mar 11 09:18:00 crc kubenswrapper[4830]: I0311 09:18:00.467661 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553678-nslvc" Mar 11 09:18:01 crc kubenswrapper[4830]: I0311 09:18:01.076356 4830 ???:1] "http: TLS handshake error from 192.168.126.11:47590: no serving certificate available for the kubelet" Mar 11 09:18:01 crc kubenswrapper[4830]: I0311 09:18:01.194361 4830 ???:1] "http: TLS handshake error from 192.168.126.11:47602: no serving certificate available for the kubelet" Mar 11 09:18:02 crc kubenswrapper[4830]: I0311 09:18:02.823066 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qn6h2" Mar 11 09:18:03 crc kubenswrapper[4830]: I0311 09:18:03.897393 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:18:03 crc kubenswrapper[4830]: I0311 09:18:03.898857 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 09:18:03 crc kubenswrapper[4830]: I0311 09:18:03.917763 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e06af9c6-9acb-4a23-bc91-01fd25fa4915-metrics-certs\") pod \"network-metrics-daemon-zl7s2\" (UID: \"e06af9c6-9acb-4a23-bc91-01fd25fa4915\") " pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:18:04 crc kubenswrapper[4830]: I0311 09:18:04.164062 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 11 09:18:04 crc kubenswrapper[4830]: I0311 09:18:04.172457 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zl7s2" Mar 11 09:18:04 crc kubenswrapper[4830]: E0311 09:18:04.288150 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-conmon-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache]" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.060028 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.137341 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kube-api-access\") pod \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\" (UID: \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\") " Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.137423 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kubelet-dir\") pod \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\" (UID: \"b7be9c85-ce7e-44c6-af8e-9a1cb139036a\") " Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.137736 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b7be9c85-ce7e-44c6-af8e-9a1cb139036a" (UID: "b7be9c85-ce7e-44c6-af8e-9a1cb139036a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.148815 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b7be9c85-ce7e-44c6-af8e-9a1cb139036a" (UID: "b7be9c85-ce7e-44c6-af8e-9a1cb139036a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.214718 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ctcw6" event={"ID":"2438f79c-45d2-4b4f-951b-630d3fb2c740","Type":"ContainerStarted","Data":"b56fa36c75fca2ce62a168f50bd57712ff1c5eacd51f1276a6625310ec00011d"} Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.216955 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b7be9c85-ce7e-44c6-af8e-9a1cb139036a","Type":"ContainerDied","Data":"54a44c38d7e22206c8600be86c5f6087233561af542cef056cdb1693609fd79f"} Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.216991 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a44c38d7e22206c8600be86c5f6087233561af542cef056cdb1693609fd79f" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.217080 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.239159 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.239184 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7be9c85-ce7e-44c6-af8e-9a1cb139036a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.761221 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-cprgf" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.798851 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:18:06 crc kubenswrapper[4830]: I0311 09:18:06.802267 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:18:07 crc kubenswrapper[4830]: E0311 09:18:07.696589 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 11 09:18:07 crc kubenswrapper[4830]: E0311 09:18:07.696757 4830 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 11 09:18:07 crc kubenswrapper[4830]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 11 09:18:07 crc kubenswrapper[4830]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l9qh2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29553676-chghx_openshift-infra(680233cf-fda8-402e-95a6-a596a0edd470): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 11 09:18:07 crc kubenswrapper[4830]: > logger="UnhandledError"
Mar 11 09:18:07 crc kubenswrapper[4830]: E0311 09:18:07.698013 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29553676-chghx" podUID="680233cf-fda8-402e-95a6-a596a0edd470"
Mar 11 09:18:08 crc kubenswrapper[4830]: E0311 09:18:08.227605 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29553676-chghx" podUID="680233cf-fda8-402e-95a6-a596a0edd470"
Mar 11 09:18:10 crc kubenswrapper[4830]: I0311 09:18:10.694305 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78bc98697b-bcww7"]
Mar 11 09:18:10 crc kubenswrapper[4830]: I0311 09:18:10.694562 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" podUID="145c64de-fbd3-4d66-b20a-35312d942033" containerName="controller-manager" containerID="cri-o://45c9c055423bc6c1f4f0a01df0738270f3269ca5f78e31e3387aab6ebdaf49e0" gracePeriod=30
Mar 11 09:18:10 crc kubenswrapper[4830]: I0311 09:18:10.713078 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6"]
Mar 11 09:18:10 crc kubenswrapper[4830]: I0311 09:18:10.713383 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" podUID="607fd908-111b-4a04-a6d7-09d548896c30" containerName="route-controller-manager" containerID="cri-o://e08a04673deba6b68691b142a71620ab1204d4c41ff0c6c0a1a471e98fc823c7" gracePeriod=30
Mar 11 09:18:11 crc kubenswrapper[4830]: I0311 09:18:11.348568 4830 ???:1] "http: TLS handshake error from 192.168.126.11:40132: no serving certificate available for the kubelet"
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.017446 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.154669 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kube-api-access\") pod \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\" (UID: \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\") "
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.155117 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kubelet-dir\") pod \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\" (UID: \"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6\") "
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.155240 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dd5c7d03-0ae0-4fa4-8265-ab0645b959d6" (UID: "dd5c7d03-0ae0-4fa4-8265-ab0645b959d6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.155420 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.162025 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dd5c7d03-0ae0-4fa4-8265-ab0645b959d6" (UID: "dd5c7d03-0ae0-4fa4-8265-ab0645b959d6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.248685 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"dd5c7d03-0ae0-4fa4-8265-ab0645b959d6","Type":"ContainerDied","Data":"bb01090205845624bef0fee828dc54f28d01fc33620856331b24cf7a1fd51e70"}
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.248726 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb01090205845624bef0fee828dc54f28d01fc33620856331b24cf7a1fd51e70"
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.248811 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.252202 4830 generic.go:334] "Generic (PLEG): container finished" podID="145c64de-fbd3-4d66-b20a-35312d942033" containerID="45c9c055423bc6c1f4f0a01df0738270f3269ca5f78e31e3387aab6ebdaf49e0" exitCode=0
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.252270 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" event={"ID":"145c64de-fbd3-4d66-b20a-35312d942033","Type":"ContainerDied","Data":"45c9c055423bc6c1f4f0a01df0738270f3269ca5f78e31e3387aab6ebdaf49e0"}
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.253694 4830 generic.go:334] "Generic (PLEG): container finished" podID="607fd908-111b-4a04-a6d7-09d548896c30" containerID="e08a04673deba6b68691b142a71620ab1204d4c41ff0c6c0a1a471e98fc823c7" exitCode=0
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.253733 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" event={"ID":"607fd908-111b-4a04-a6d7-09d548896c30","Type":"ContainerDied","Data":"e08a04673deba6b68691b142a71620ab1204d4c41ff0c6c0a1a471e98fc823c7"}
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.256496 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd5c7d03-0ae0-4fa4-8265-ab0645b959d6-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:12 crc kubenswrapper[4830]: I0311 09:18:12.422358 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbnm8"]
Mar 11 09:18:13 crc kubenswrapper[4830]: I0311 09:18:13.060758 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 09:18:13 crc kubenswrapper[4830]: I0311 09:18:13.061691 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 09:18:14 crc kubenswrapper[4830]: E0311 09:18:14.406466 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-conmon-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache]"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.533422 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.543222 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.595644 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145c64de-fbd3-4d66-b20a-35312d942033-serving-cert\") pod \"145c64de-fbd3-4d66-b20a-35312d942033\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.605268 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84cb88f566-76vbb"]
Mar 11 09:18:14 crc kubenswrapper[4830]: E0311 09:18:14.605831 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607fd908-111b-4a04-a6d7-09d548896c30" containerName="route-controller-manager"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.605865 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="607fd908-111b-4a04-a6d7-09d548896c30" containerName="route-controller-manager"
Mar 11 09:18:14 crc kubenswrapper[4830]: E0311 09:18:14.605886 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5c7d03-0ae0-4fa4-8265-ab0645b959d6" containerName="pruner"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.605894 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5c7d03-0ae0-4fa4-8265-ab0645b959d6" containerName="pruner"
Mar 11 09:18:14 crc kubenswrapper[4830]: E0311 09:18:14.605908 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7be9c85-ce7e-44c6-af8e-9a1cb139036a" containerName="pruner"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.605917 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7be9c85-ce7e-44c6-af8e-9a1cb139036a" containerName="pruner"
Mar 11 09:18:14 crc kubenswrapper[4830]: E0311 09:18:14.605932 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145c64de-fbd3-4d66-b20a-35312d942033" containerName="controller-manager"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.605942 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="145c64de-fbd3-4d66-b20a-35312d942033" containerName="controller-manager"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.606098 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="607fd908-111b-4a04-a6d7-09d548896c30" containerName="route-controller-manager"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.606116 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7be9c85-ce7e-44c6-af8e-9a1cb139036a" containerName="pruner"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.606126 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="145c64de-fbd3-4d66-b20a-35312d942033" containerName="controller-manager"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.606143 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5c7d03-0ae0-4fa4-8265-ab0645b959d6" containerName="pruner"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.606834 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.611854 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145c64de-fbd3-4d66-b20a-35312d942033-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "145c64de-fbd3-4d66-b20a-35312d942033" (UID: "145c64de-fbd3-4d66-b20a-35312d942033"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.616197 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84cb88f566-76vbb"]
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.696856 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-client-ca\") pod \"145c64de-fbd3-4d66-b20a-35312d942033\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.696950 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607fd908-111b-4a04-a6d7-09d548896c30-serving-cert\") pod \"607fd908-111b-4a04-a6d7-09d548896c30\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.697775 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-client-ca\") pod \"607fd908-111b-4a04-a6d7-09d548896c30\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.699319 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-client-ca" (OuterVolumeSpecName: "client-ca") pod "145c64de-fbd3-4d66-b20a-35312d942033" (UID: "145c64de-fbd3-4d66-b20a-35312d942033"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.701559 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-config\") pod \"145c64de-fbd3-4d66-b20a-35312d942033\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.701618 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-config\") pod \"607fd908-111b-4a04-a6d7-09d548896c30\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.701662 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvd6n\" (UniqueName: \"kubernetes.io/projected/607fd908-111b-4a04-a6d7-09d548896c30-kube-api-access-kvd6n\") pod \"607fd908-111b-4a04-a6d7-09d548896c30\" (UID: \"607fd908-111b-4a04-a6d7-09d548896c30\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.701672 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-client-ca" (OuterVolumeSpecName: "client-ca") pod "607fd908-111b-4a04-a6d7-09d548896c30" (UID: "607fd908-111b-4a04-a6d7-09d548896c30"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.701697 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-proxy-ca-bundles\") pod \"145c64de-fbd3-4d66-b20a-35312d942033\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.702048 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "145c64de-fbd3-4d66-b20a-35312d942033" (UID: "145c64de-fbd3-4d66-b20a-35312d942033"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.702052 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7sdh\" (UniqueName: \"kubernetes.io/projected/145c64de-fbd3-4d66-b20a-35312d942033-kube-api-access-j7sdh\") pod \"145c64de-fbd3-4d66-b20a-35312d942033\" (UID: \"145c64de-fbd3-4d66-b20a-35312d942033\") "
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.705735 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-config" (OuterVolumeSpecName: "config") pod "145c64de-fbd3-4d66-b20a-35312d942033" (UID: "145c64de-fbd3-4d66-b20a-35312d942033"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.706224 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145c64de-fbd3-4d66-b20a-35312d942033-kube-api-access-j7sdh" (OuterVolumeSpecName: "kube-api-access-j7sdh") pod "145c64de-fbd3-4d66-b20a-35312d942033" (UID: "145c64de-fbd3-4d66-b20a-35312d942033"). InnerVolumeSpecName "kube-api-access-j7sdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.706248 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/607fd908-111b-4a04-a6d7-09d548896c30-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "607fd908-111b-4a04-a6d7-09d548896c30" (UID: "607fd908-111b-4a04-a6d7-09d548896c30"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.707768 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-config" (OuterVolumeSpecName: "config") pod "607fd908-111b-4a04-a6d7-09d548896c30" (UID: "607fd908-111b-4a04-a6d7-09d548896c30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.708515 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/607fd908-111b-4a04-a6d7-09d548896c30-kube-api-access-kvd6n" (OuterVolumeSpecName: "kube-api-access-kvd6n") pod "607fd908-111b-4a04-a6d7-09d548896c30" (UID: "607fd908-111b-4a04-a6d7-09d548896c30"). InnerVolumeSpecName "kube-api-access-kvd6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.709591 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/145c64de-fbd3-4d66-b20a-35312d942033-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.709625 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-config\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.709637 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.709651 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7sdh\" (UniqueName: \"kubernetes.io/projected/145c64de-fbd3-4d66-b20a-35312d942033-kube-api-access-j7sdh\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.709664 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/145c64de-fbd3-4d66-b20a-35312d942033-client-ca\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.709674 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/607fd908-111b-4a04-a6d7-09d548896c30-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.709683 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-client-ca\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.730353 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zl7s2"]
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.811117 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-config\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.811243 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-proxy-ca-bundles\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.811287 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1629844-d739-4be3-b555-adbd3245dd96-serving-cert\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.811482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-client-ca\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.811628 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z4w5\" (UniqueName: \"kubernetes.io/projected/f1629844-d739-4be3-b555-adbd3245dd96-kube-api-access-7z4w5\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.811777 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/607fd908-111b-4a04-a6d7-09d548896c30-config\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.811795 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvd6n\" (UniqueName: \"kubernetes.io/projected/607fd908-111b-4a04-a6d7-09d548896c30-kube-api-access-kvd6n\") on node \"crc\" DevicePath \"\""
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.913037 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-config\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.913469 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-proxy-ca-bundles\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.913508 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1629844-d739-4be3-b555-adbd3245dd96-serving-cert\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.913554 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-client-ca\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.913603 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z4w5\" (UniqueName: \"kubernetes.io/projected/f1629844-d739-4be3-b555-adbd3245dd96-kube-api-access-7z4w5\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.915250 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-proxy-ca-bundles\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.915847 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-client-ca\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.920309 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1629844-d739-4be3-b555-adbd3245dd96-serving-cert\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.938834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z4w5\" (UniqueName: \"kubernetes.io/projected/f1629844-d739-4be3-b555-adbd3245dd96-kube-api-access-7z4w5\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:14 crc kubenswrapper[4830]: I0311 09:18:14.977948 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-config\") pod \"controller-manager-84cb88f566-76vbb\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.246902 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb"
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.273557 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbnm8" event={"ID":"0110c11f-22d8-4a16-b11b-08c46b6a3bed","Type":"ContainerStarted","Data":"3ec669405684d95fe467a6b375592fd74e97daaf4b56f1448c25a983f513d9b4"}
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.276045 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6" event={"ID":"607fd908-111b-4a04-a6d7-09d548896c30","Type":"ContainerDied","Data":"a4c062f24a171d31b25a4f6df1cd7fc379cc204f2b9ef37dd225ab94b8a6ca51"}
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.276092 4830 scope.go:117] "RemoveContainer" containerID="e08a04673deba6b68691b142a71620ab1204d4c41ff0c6c0a1a471e98fc823c7"
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.276201 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6"
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.280184 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7" event={"ID":"145c64de-fbd3-4d66-b20a-35312d942033","Type":"ContainerDied","Data":"90eaa76287947098a8a5b4a6b6202b6b8e60df3f03e41973ed514b13189adeaf"}
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.280321 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78bc98697b-bcww7"
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.305381 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6"]
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.312931 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556c6c58b4-h72t6"]
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.317188 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78bc98697b-bcww7"]
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.323632 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78bc98697b-bcww7"]
Mar 11 09:18:15 crc kubenswrapper[4830]: I0311 09:18:15.632718 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5"
Mar 11 09:18:16 crc kubenswrapper[4830]: I0311 09:18:16.942930 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="145c64de-fbd3-4d66-b20a-35312d942033" path="/var/lib/kubelet/pods/145c64de-fbd3-4d66-b20a-35312d942033/volumes"
Mar 11 09:18:16 crc kubenswrapper[4830]: I0311 09:18:16.944526 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="607fd908-111b-4a04-a6d7-09d548896c30" path="/var/lib/kubelet/pods/607fd908-111b-4a04-a6d7-09d548896c30/volumes"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.699503 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"]
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.702115 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.708337 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"]
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.708536 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.708719 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.708935 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.709058 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.709484 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.709915 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.871818 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b22696f-6563-4c9b-a3a9-872ee3341ce0-serving-cert\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.871870 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh7lm\" (UniqueName: \"kubernetes.io/projected/1b22696f-6563-4c9b-a3a9-872ee3341ce0-kube-api-access-dh7lm\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.871911 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-config\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.872012 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-client-ca\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.973700 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-client-ca\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.973759 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b22696f-6563-4c9b-a3a9-872ee3341ce0-serving-cert\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.973778 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh7lm\" (UniqueName: \"kubernetes.io/projected/1b22696f-6563-4c9b-a3a9-872ee3341ce0-kube-api-access-dh7lm\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.973806 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-config\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.974925 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-config\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.975531 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-client-ca\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.979835 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b22696f-6563-4c9b-a3a9-872ee3341ce0-serving-cert\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:18 crc kubenswrapper[4830]: I0311 09:18:18.997855 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh7lm\" (UniqueName: \"kubernetes.io/projected/1b22696f-6563-4c9b-a3a9-872ee3341ce0-kube-api-access-dh7lm\") pod \"route-controller-manager-7b484bbb56-vv7zw\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:19 crc kubenswrapper[4830]: I0311 09:18:19.029471 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"
Mar 11 09:18:22 crc kubenswrapper[4830]: E0311 09:18:22.052562 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 11 09:18:22 crc kubenswrapper[4830]: E0311 09:18:22.053038 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cnkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vvk9l_openshift-marketplace(e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 09:18:22 crc kubenswrapper[4830]: E0311 09:18:22.054281 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vvk9l" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" Mar 11 09:18:24 crc 
kubenswrapper[4830]: E0311 09:18:24.587338 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-conmon-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache]" Mar 11 09:18:25 crc kubenswrapper[4830]: E0311 09:18:25.842422 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vvk9l" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" Mar 11 09:18:25 crc kubenswrapper[4830]: W0311 09:18:25.847148 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode06af9c6_9acb_4a23_bc91_01fd25fa4915.slice/crio-4b5ed5a2c691cbc92b22d8a5142a6be2fe176713a3f94dddcd494ad5075d57af WatchSource:0}: Error finding container 4b5ed5a2c691cbc92b22d8a5142a6be2fe176713a3f94dddcd494ad5075d57af: Status 404 returned error can't find the container with id 4b5ed5a2c691cbc92b22d8a5142a6be2fe176713a3f94dddcd494ad5075d57af Mar 11 09:18:25 crc kubenswrapper[4830]: E0311 09:18:25.919093 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 11 09:18:25 crc kubenswrapper[4830]: E0311 09:18:25.919473 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmw9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-m6st5_openshift-marketplace(b6155028-4ba3-48be-b83d-7bbe65f28ba7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 09:18:25 crc kubenswrapper[4830]: E0311 09:18:25.920978 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-m6st5" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" Mar 11 09:18:26 crc kubenswrapper[4830]: I0311 09:18:26.343652 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" event={"ID":"e06af9c6-9acb-4a23-bc91-01fd25fa4915","Type":"ContainerStarted","Data":"4b5ed5a2c691cbc92b22d8a5142a6be2fe176713a3f94dddcd494ad5075d57af"} Mar 11 09:18:27 crc kubenswrapper[4830]: E0311 09:18:27.649562 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-m6st5" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" Mar 11 09:18:27 crc kubenswrapper[4830]: I0311 09:18:27.654054 4830 scope.go:117] "RemoveContainer" containerID="45c9c055423bc6c1f4f0a01df0738270f3269ca5f78e31e3387aab6ebdaf49e0" Mar 11 09:18:27 crc kubenswrapper[4830]: I0311 09:18:27.673636 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zdtbz" Mar 11 09:18:27 crc kubenswrapper[4830]: E0311 09:18:27.734922 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 11 09:18:27 crc kubenswrapper[4830]: E0311 09:18:27.735157 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhwgx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wbnft_openshift-marketplace(ebefef77-9e3b-45d5-8301-53df1b75c9bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 11 09:18:27 crc kubenswrapper[4830]: E0311 09:18:27.737079 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wbnft" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" Mar 11 09:18:27 crc 
kubenswrapper[4830]: I0311 09:18:27.859407 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553678-nslvc"] Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.173815 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.175813 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.178091 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.178234 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.180074 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.239645 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc42ea8b-f92f-4350-95d9-ae48faf461de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cc42ea8b-f92f-4350-95d9-ae48faf461de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.239799 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc42ea8b-f92f-4350-95d9-ae48faf461de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cc42ea8b-f92f-4350-95d9-ae48faf461de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.340904 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc42ea8b-f92f-4350-95d9-ae48faf461de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cc42ea8b-f92f-4350-95d9-ae48faf461de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.341062 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc42ea8b-f92f-4350-95d9-ae48faf461de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cc42ea8b-f92f-4350-95d9-ae48faf461de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.341409 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc42ea8b-f92f-4350-95d9-ae48faf461de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cc42ea8b-f92f-4350-95d9-ae48faf461de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.374773 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc42ea8b-f92f-4350-95d9-ae48faf461de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cc42ea8b-f92f-4350-95d9-ae48faf461de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:29 crc kubenswrapper[4830]: E0311 09:18:29.384203 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wbnft" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" Mar 11 09:18:29 crc kubenswrapper[4830]: E0311 09:18:29.491741 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest 
list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 11 09:18:29 crc kubenswrapper[4830]: E0311 09:18:29.491916 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhjg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-x9zpp_openshift-marketplace(7febf059-370d-4a68-a543-3b23879ba479): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" logger="UnhandledError" Mar 11 09:18:29 crc kubenswrapper[4830]: E0311 09:18:29.493335 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x9zpp" podUID="7febf059-370d-4a68-a543-3b23879ba479" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.513817 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.683468 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"] Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.756051 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84cb88f566-76vbb"] Mar 11 09:18:29 crc kubenswrapper[4830]: W0311 09:18:29.774079 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b22696f_6563_4c9b_a3a9_872ee3341ce0.slice/crio-ddb468ab52d8efaeca04361d90eb970f600261dd23c43772462c1a439c652a7c WatchSource:0}: Error finding container ddb468ab52d8efaeca04361d90eb970f600261dd23c43772462c1a439c652a7c: Status 404 returned error can't find the container with id ddb468ab52d8efaeca04361d90eb970f600261dd23c43772462c1a439c652a7c Mar 11 09:18:29 crc kubenswrapper[4830]: I0311 09:18:29.819200 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.367461 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" 
event={"ID":"e06af9c6-9acb-4a23-bc91-01fd25fa4915","Type":"ContainerStarted","Data":"e68bdaed9822aea132581a1cd95e1aba764d91d33d6e73fe42298a8ec74a8628"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.368094 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zl7s2" event={"ID":"e06af9c6-9acb-4a23-bc91-01fd25fa4915","Type":"ContainerStarted","Data":"81e663c3f1a4fb008ef71f52aede2028d8f7435cb03c75db97e7c6930c9d5a51"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.369666 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"cc42ea8b-f92f-4350-95d9-ae48faf461de","Type":"ContainerStarted","Data":"09087a4459a081e97ff5b7a16ad06407d317690fa3dd1b621a1965ad52b0d1fa"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.369700 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"cc42ea8b-f92f-4350-95d9-ae48faf461de","Type":"ContainerStarted","Data":"2560e048f1b9bda61e47f5dc93d6fd378b01122f1167094489e839e503ddf914"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.377224 4830 generic.go:334] "Generic (PLEG): container finished" podID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerID="f7857ec02c55ae21e66269ceca325950789db592646ee7356fc0c84b682d958b" exitCode=0 Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.377256 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qgv" event={"ID":"44447137-e7c5-4a07-bcdd-6bcc4c835f79","Type":"ContainerDied","Data":"f7857ec02c55ae21e66269ceca325950789db592646ee7356fc0c84b682d958b"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.382972 4830 generic.go:334] "Generic (PLEG): container finished" podID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerID="3af0514b220b7146be1d0c827c2887bc089d1d948456c5abcad32e2dafa02160" exitCode=0 Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 
09:18:30.383064 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngwt4" event={"ID":"63e229ca-0853-43c9-8ad6-a5e236df0812","Type":"ContainerDied","Data":"3af0514b220b7146be1d0c827c2887bc089d1d948456c5abcad32e2dafa02160"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.392540 4830 generic.go:334] "Generic (PLEG): container finished" podID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerID="b7253f3f04390e521eed635ac3c777bec154573f4ee5d1b2a869a993d1e12bd5" exitCode=0 Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.392623 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ctcw6" event={"ID":"2438f79c-45d2-4b4f-951b-630d3fb2c740","Type":"ContainerDied","Data":"b7253f3f04390e521eed635ac3c777bec154573f4ee5d1b2a869a993d1e12bd5"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.399434 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553678-nslvc" event={"ID":"e577451d-6016-4afc-913a-6d022a9a2f79","Type":"ContainerStarted","Data":"7b98d6783f65a5e2d1701072336a4934be4cf25f438018f032cff9df3401a1c2"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.404487 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zl7s2" podStartSLOduration=208.404460664 podStartE2EDuration="3m28.404460664s" podCreationTimestamp="2026-03-11 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:18:30.39554933 +0000 UTC m=+278.176700039" watchObservedRunningTime="2026-03-11 09:18:30.404460664 +0000 UTC m=+278.185611363" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.408076 4830 generic.go:334] "Generic (PLEG): container finished" podID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerID="63db21c8ead7380860951efa76345f6bc5d2abfc3dd94047883a131e5ad3f456" 
exitCode=0 Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.408179 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbnm8" event={"ID":"0110c11f-22d8-4a16-b11b-08c46b6a3bed","Type":"ContainerDied","Data":"63db21c8ead7380860951efa76345f6bc5d2abfc3dd94047883a131e5ad3f456"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.411098 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" event={"ID":"1b22696f-6563-4c9b-a3a9-872ee3341ce0","Type":"ContainerStarted","Data":"ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.411155 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" event={"ID":"1b22696f-6563-4c9b-a3a9-872ee3341ce0","Type":"ContainerStarted","Data":"ddb468ab52d8efaeca04361d90eb970f600261dd23c43772462c1a439c652a7c"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.411779 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.419853 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553676-chghx" event={"ID":"680233cf-fda8-402e-95a6-a596a0edd470","Type":"ContainerStarted","Data":"86dfda9e99c2b7e208ead4e5c5eca235f1e785d63b7a9507bc27070f8ee8c62f"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.421642 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.421622252 podStartE2EDuration="1.421622252s" podCreationTimestamp="2026-03-11 09:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-11 09:18:30.420185532 +0000 UTC m=+278.201336241" watchObservedRunningTime="2026-03-11 09:18:30.421622252 +0000 UTC m=+278.202772941" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.422522 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" event={"ID":"f1629844-d739-4be3-b555-adbd3245dd96","Type":"ContainerStarted","Data":"50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.422561 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" event={"ID":"f1629844-d739-4be3-b555-adbd3245dd96","Type":"ContainerStarted","Data":"5dae0dbfef60148aedfd1ed3af0b103ed57dcfb96173908147922a5b028d6caf"} Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.423333 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" Mar 11 09:18:30 crc kubenswrapper[4830]: E0311 09:18:30.437044 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x9zpp" podUID="7febf059-370d-4a68-a543-3b23879ba479" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.440289 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.500347 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.625706 4830 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" podStartSLOduration=20.625690359 podStartE2EDuration="20.625690359s" podCreationTimestamp="2026-03-11 09:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:18:30.60994474 +0000 UTC m=+278.391095419" watchObservedRunningTime="2026-03-11 09:18:30.625690359 +0000 UTC m=+278.406841048" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.626148 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553676-chghx" podStartSLOduration=109.495063381 podStartE2EDuration="2m30.626145001s" podCreationTimestamp="2026-03-11 09:16:00 +0000 UTC" firstStartedPulling="2026-03-11 09:17:48.523417059 +0000 UTC m=+236.304567748" lastFinishedPulling="2026-03-11 09:18:29.654498679 +0000 UTC m=+277.435649368" observedRunningTime="2026-03-11 09:18:30.622995962 +0000 UTC m=+278.404146651" watchObservedRunningTime="2026-03-11 09:18:30.626145001 +0000 UTC m=+278.407295690" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.651518 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" podStartSLOduration=20.651486365 podStartE2EDuration="20.651486365s" podCreationTimestamp="2026-03-11 09:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:18:30.647341436 +0000 UTC m=+278.428492125" watchObservedRunningTime="2026-03-11 09:18:30.651486365 +0000 UTC m=+278.432637054" Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.755775 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84cb88f566-76vbb"] Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.801191 4830 csr.go:261] certificate 
signing request csr-n7fdj is approved, waiting to be issued Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.807582 4830 csr.go:257] certificate signing request csr-n7fdj is issued Mar 11 09:18:30 crc kubenswrapper[4830]: I0311 09:18:30.862162 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"] Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.437736 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qgv" event={"ID":"44447137-e7c5-4a07-bcdd-6bcc4c835f79","Type":"ContainerStarted","Data":"81df5ae2df36edb1b9c15a9635b190d74b30a0827b8307a98022c6328c0c3a87"} Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.441319 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngwt4" event={"ID":"63e229ca-0853-43c9-8ad6-a5e236df0812","Type":"ContainerStarted","Data":"c7e98b863bd7960e112ea6ce3c201c7bbb5b86b55dd2feede7382ee6727a58f5"} Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.443545 4830 generic.go:334] "Generic (PLEG): container finished" podID="680233cf-fda8-402e-95a6-a596a0edd470" containerID="86dfda9e99c2b7e208ead4e5c5eca235f1e785d63b7a9507bc27070f8ee8c62f" exitCode=0 Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.443652 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553676-chghx" event={"ID":"680233cf-fda8-402e-95a6-a596a0edd470","Type":"ContainerDied","Data":"86dfda9e99c2b7e208ead4e5c5eca235f1e785d63b7a9507bc27070f8ee8c62f"} Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.445876 4830 generic.go:334] "Generic (PLEG): container finished" podID="e577451d-6016-4afc-913a-6d022a9a2f79" containerID="8e2e106054c1b0bd60e49b1e7ace0ec94301e773ddcc72e373e291a92e695311" exitCode=0 Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.445942 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29553678-nslvc" event={"ID":"e577451d-6016-4afc-913a-6d022a9a2f79","Type":"ContainerDied","Data":"8e2e106054c1b0bd60e49b1e7ace0ec94301e773ddcc72e373e291a92e695311"} Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.450093 4830 generic.go:334] "Generic (PLEG): container finished" podID="cc42ea8b-f92f-4350-95d9-ae48faf461de" containerID="09087a4459a081e97ff5b7a16ad06407d317690fa3dd1b621a1965ad52b0d1fa" exitCode=0 Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.450285 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"cc42ea8b-f92f-4350-95d9-ae48faf461de","Type":"ContainerDied","Data":"09087a4459a081e97ff5b7a16ad06407d317690fa3dd1b621a1965ad52b0d1fa"} Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.483703 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d5qgv" podStartSLOduration=2.688853002 podStartE2EDuration="37.483685732s" podCreationTimestamp="2026-03-11 09:17:54 +0000 UTC" firstStartedPulling="2026-03-11 09:17:56.024470897 +0000 UTC m=+243.805621586" lastFinishedPulling="2026-03-11 09:18:30.819303627 +0000 UTC m=+278.600454316" observedRunningTime="2026-03-11 09:18:31.463335502 +0000 UTC m=+279.244486191" watchObservedRunningTime="2026-03-11 09:18:31.483685732 +0000 UTC m=+279.264836421" Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.483805 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ngwt4" podStartSLOduration=10.458979674 podStartE2EDuration="36.483801606s" podCreationTimestamp="2026-03-11 09:17:55 +0000 UTC" firstStartedPulling="2026-03-11 09:18:04.825453258 +0000 UTC m=+252.606603957" lastFinishedPulling="2026-03-11 09:18:30.8502752 +0000 UTC m=+278.631425889" observedRunningTime="2026-03-11 09:18:31.480601455 +0000 UTC m=+279.261752154" watchObservedRunningTime="2026-03-11 
09:18:31.483801606 +0000 UTC m=+279.264952295" Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.808811 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-30 18:19:15.836127951 +0000 UTC Mar 11 09:18:31 crc kubenswrapper[4830]: I0311 09:18:31.808856 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6345h0m44.027275531s for next certificate rotation Mar 11 09:18:32 crc kubenswrapper[4830]: I0311 09:18:32.473858 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" podUID="1b22696f-6563-4c9b-a3a9-872ee3341ce0" containerName="route-controller-manager" containerID="cri-o://ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2" gracePeriod=30 Mar 11 09:18:32 crc kubenswrapper[4830]: I0311 09:18:32.476475 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" podUID="f1629844-d739-4be3-b555-adbd3245dd96" containerName="controller-manager" containerID="cri-o://50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe" gracePeriod=30 Mar 11 09:18:32 crc kubenswrapper[4830]: I0311 09:18:32.806384 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553678-nslvc" Mar 11 09:18:32 crc kubenswrapper[4830]: I0311 09:18:32.809522 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-28 00:37:10.223452216 +0000 UTC Mar 11 09:18:32 crc kubenswrapper[4830]: I0311 09:18:32.809573 4830 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6999h18m37.413882835s for next certificate rotation Mar 11 09:18:32 crc kubenswrapper[4830]: I0311 09:18:32.956086 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:32 crc kubenswrapper[4830]: I0311 09:18:32.964525 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553676-chghx" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:32.999483 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5mwq\" (UniqueName: \"kubernetes.io/projected/e577451d-6016-4afc-913a-6d022a9a2f79-kube-api-access-z5mwq\") pod \"e577451d-6016-4afc-913a-6d022a9a2f79\" (UID: \"e577451d-6016-4afc-913a-6d022a9a2f79\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.007980 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e577451d-6016-4afc-913a-6d022a9a2f79-kube-api-access-z5mwq" (OuterVolumeSpecName: "kube-api-access-z5mwq") pod "e577451d-6016-4afc-913a-6d022a9a2f79" (UID: "e577451d-6016-4afc-913a-6d022a9a2f79"). InnerVolumeSpecName "kube-api-access-z5mwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.017652 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.048663 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.101481 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9qh2\" (UniqueName: \"kubernetes.io/projected/680233cf-fda8-402e-95a6-a596a0edd470-kube-api-access-l9qh2\") pod \"680233cf-fda8-402e-95a6-a596a0edd470\" (UID: \"680233cf-fda8-402e-95a6-a596a0edd470\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.101589 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc42ea8b-f92f-4350-95d9-ae48faf461de-kube-api-access\") pod \"cc42ea8b-f92f-4350-95d9-ae48faf461de\" (UID: \"cc42ea8b-f92f-4350-95d9-ae48faf461de\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.101607 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc42ea8b-f92f-4350-95d9-ae48faf461de-kubelet-dir\") pod \"cc42ea8b-f92f-4350-95d9-ae48faf461de\" (UID: \"cc42ea8b-f92f-4350-95d9-ae48faf461de\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.101924 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5mwq\" (UniqueName: \"kubernetes.io/projected/e577451d-6016-4afc-913a-6d022a9a2f79-kube-api-access-z5mwq\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.101967 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc42ea8b-f92f-4350-95d9-ae48faf461de-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cc42ea8b-f92f-4350-95d9-ae48faf461de" (UID: 
"cc42ea8b-f92f-4350-95d9-ae48faf461de"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.105604 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc42ea8b-f92f-4350-95d9-ae48faf461de-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cc42ea8b-f92f-4350-95d9-ae48faf461de" (UID: "cc42ea8b-f92f-4350-95d9-ae48faf461de"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.106871 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680233cf-fda8-402e-95a6-a596a0edd470-kube-api-access-l9qh2" (OuterVolumeSpecName: "kube-api-access-l9qh2") pod "680233cf-fda8-402e-95a6-a596a0edd470" (UID: "680233cf-fda8-402e-95a6-a596a0edd470"). InnerVolumeSpecName "kube-api-access-l9qh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203477 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-client-ca\") pod \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203614 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b22696f-6563-4c9b-a3a9-872ee3341ce0-serving-cert\") pod \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203692 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh7lm\" (UniqueName: 
\"kubernetes.io/projected/1b22696f-6563-4c9b-a3a9-872ee3341ce0-kube-api-access-dh7lm\") pod \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203725 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-config\") pod \"f1629844-d739-4be3-b555-adbd3245dd96\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203746 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-client-ca\") pod \"f1629844-d739-4be3-b555-adbd3245dd96\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203776 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-proxy-ca-bundles\") pod \"f1629844-d739-4be3-b555-adbd3245dd96\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203798 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-config\") pod \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\" (UID: \"1b22696f-6563-4c9b-a3a9-872ee3341ce0\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203813 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1629844-d739-4be3-b555-adbd3245dd96-serving-cert\") pod \"f1629844-d739-4be3-b555-adbd3245dd96\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.203833 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z4w5\" (UniqueName: \"kubernetes.io/projected/f1629844-d739-4be3-b555-adbd3245dd96-kube-api-access-7z4w5\") pod \"f1629844-d739-4be3-b555-adbd3245dd96\" (UID: \"f1629844-d739-4be3-b555-adbd3245dd96\") " Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.204076 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc42ea8b-f92f-4350-95d9-ae48faf461de-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.204094 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cc42ea8b-f92f-4350-95d9-ae48faf461de-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.204104 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9qh2\" (UniqueName: \"kubernetes.io/projected/680233cf-fda8-402e-95a6-a596a0edd470-kube-api-access-l9qh2\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.204834 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f1629844-d739-4be3-b555-adbd3245dd96" (UID: "f1629844-d739-4be3-b555-adbd3245dd96"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.204855 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-client-ca" (OuterVolumeSpecName: "client-ca") pod "f1629844-d739-4be3-b555-adbd3245dd96" (UID: "f1629844-d739-4be3-b555-adbd3245dd96"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.204903 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b22696f-6563-4c9b-a3a9-872ee3341ce0" (UID: "1b22696f-6563-4c9b-a3a9-872ee3341ce0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.205001 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-config" (OuterVolumeSpecName: "config") pod "f1629844-d739-4be3-b555-adbd3245dd96" (UID: "f1629844-d739-4be3-b555-adbd3245dd96"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.205408 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-config" (OuterVolumeSpecName: "config") pod "1b22696f-6563-4c9b-a3a9-872ee3341ce0" (UID: "1b22696f-6563-4c9b-a3a9-872ee3341ce0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.208209 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b22696f-6563-4c9b-a3a9-872ee3341ce0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b22696f-6563-4c9b-a3a9-872ee3341ce0" (UID: "1b22696f-6563-4c9b-a3a9-872ee3341ce0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.209176 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1629844-d739-4be3-b555-adbd3245dd96-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f1629844-d739-4be3-b555-adbd3245dd96" (UID: "f1629844-d739-4be3-b555-adbd3245dd96"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.211990 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b22696f-6563-4c9b-a3a9-872ee3341ce0-kube-api-access-dh7lm" (OuterVolumeSpecName: "kube-api-access-dh7lm") pod "1b22696f-6563-4c9b-a3a9-872ee3341ce0" (UID: "1b22696f-6563-4c9b-a3a9-872ee3341ce0"). InnerVolumeSpecName "kube-api-access-dh7lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.212515 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1629844-d739-4be3-b555-adbd3245dd96-kube-api-access-7z4w5" (OuterVolumeSpecName: "kube-api-access-7z4w5") pod "f1629844-d739-4be3-b555-adbd3245dd96" (UID: "f1629844-d739-4be3-b555-adbd3245dd96"). InnerVolumeSpecName "kube-api-access-7z4w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306157 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b22696f-6563-4c9b-a3a9-872ee3341ce0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306230 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh7lm\" (UniqueName: \"kubernetes.io/projected/1b22696f-6563-4c9b-a3a9-872ee3341ce0-kube-api-access-dh7lm\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306253 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306274 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306293 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f1629844-d739-4be3-b555-adbd3245dd96-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306312 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306329 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1629844-d739-4be3-b555-adbd3245dd96-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306348 4830 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-7z4w5\" (UniqueName: \"kubernetes.io/projected/f1629844-d739-4be3-b555-adbd3245dd96-kube-api-access-7z4w5\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.306365 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b22696f-6563-4c9b-a3a9-872ee3341ce0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.475036 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553676-chghx" event={"ID":"680233cf-fda8-402e-95a6-a596a0edd470","Type":"ContainerDied","Data":"eb33a8b76e0b36f1440e218b0e3689d921e460afff0ca753982e13615ac329c3"} Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.475099 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb33a8b76e0b36f1440e218b0e3689d921e460afff0ca753982e13615ac329c3" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.475194 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553676-chghx" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.501051 4830 generic.go:334] "Generic (PLEG): container finished" podID="f1629844-d739-4be3-b555-adbd3245dd96" containerID="50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe" exitCode=0 Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.501132 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" event={"ID":"f1629844-d739-4be3-b555-adbd3245dd96","Type":"ContainerDied","Data":"50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe"} Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.501165 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" event={"ID":"f1629844-d739-4be3-b555-adbd3245dd96","Type":"ContainerDied","Data":"5dae0dbfef60148aedfd1ed3af0b103ed57dcfb96173908147922a5b028d6caf"} Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.501186 4830 scope.go:117] "RemoveContainer" containerID="50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.501316 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84cb88f566-76vbb" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.540401 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553678-nslvc" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.540414 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553678-nslvc" event={"ID":"e577451d-6016-4afc-913a-6d022a9a2f79","Type":"ContainerDied","Data":"7b98d6783f65a5e2d1701072336a4934be4cf25f438018f032cff9df3401a1c2"} Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.541649 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b98d6783f65a5e2d1701072336a4934be4cf25f438018f032cff9df3401a1c2" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.545481 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-84cb88f566-76vbb"] Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.552587 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-84cb88f566-76vbb"] Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.554713 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"cc42ea8b-f92f-4350-95d9-ae48faf461de","Type":"ContainerDied","Data":"2560e048f1b9bda61e47f5dc93d6fd378b01122f1167094489e839e503ddf914"} Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.554757 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2560e048f1b9bda61e47f5dc93d6fd378b01122f1167094489e839e503ddf914" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.554821 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.568225 4830 scope.go:117] "RemoveContainer" containerID="50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.568367 4830 generic.go:334] "Generic (PLEG): container finished" podID="1b22696f-6563-4c9b-a3a9-872ee3341ce0" containerID="ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2" exitCode=0 Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.568404 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" event={"ID":"1b22696f-6563-4c9b-a3a9-872ee3341ce0","Type":"ContainerDied","Data":"ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2"} Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.568430 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" event={"ID":"1b22696f-6563-4c9b-a3a9-872ee3341ce0","Type":"ContainerDied","Data":"ddb468ab52d8efaeca04361d90eb970f600261dd23c43772462c1a439c652a7c"} Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.568476 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw" Mar 11 09:18:33 crc kubenswrapper[4830]: E0311 09:18:33.569825 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe\": container with ID starting with 50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe not found: ID does not exist" containerID="50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.570191 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe"} err="failed to get container status \"50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe\": rpc error: code = NotFound desc = could not find container \"50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe\": container with ID starting with 50cdf3eaf5cdb9c3977a20b37ac51a58190f90921a0e42f8dda74c8fa1b7affe not found: ID does not exist" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.570230 4830 scope.go:117] "RemoveContainer" containerID="ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.597117 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"] Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.598307 4830 scope.go:117] "RemoveContainer" containerID="ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.600612 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b484bbb56-vv7zw"] Mar 11 09:18:33 crc kubenswrapper[4830]: E0311 09:18:33.600671 4830 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2\": container with ID starting with ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2 not found: ID does not exist" containerID="ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.600848 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2"} err="failed to get container status \"ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2\": rpc error: code = NotFound desc = could not find container \"ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2\": container with ID starting with ae87d59727711dd52d92a96496ae5cff53b5dfd51f9bfabd7043ab7fdc2dfec2 not found: ID does not exist" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.956559 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 09:18:33 crc kubenswrapper[4830]: E0311 09:18:33.956949 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e577451d-6016-4afc-913a-6d022a9a2f79" containerName="oc" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957082 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e577451d-6016-4afc-913a-6d022a9a2f79" containerName="oc" Mar 11 09:18:33 crc kubenswrapper[4830]: E0311 09:18:33.957093 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1629844-d739-4be3-b555-adbd3245dd96" containerName="controller-manager" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957099 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1629844-d739-4be3-b555-adbd3245dd96" containerName="controller-manager" Mar 11 09:18:33 crc kubenswrapper[4830]: E0311 09:18:33.957108 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680233cf-fda8-402e-95a6-a596a0edd470" containerName="oc" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957116 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="680233cf-fda8-402e-95a6-a596a0edd470" containerName="oc" Mar 11 09:18:33 crc kubenswrapper[4830]: E0311 09:18:33.957133 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b22696f-6563-4c9b-a3a9-872ee3341ce0" containerName="route-controller-manager" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957139 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b22696f-6563-4c9b-a3a9-872ee3341ce0" containerName="route-controller-manager" Mar 11 09:18:33 crc kubenswrapper[4830]: E0311 09:18:33.957152 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc42ea8b-f92f-4350-95d9-ae48faf461de" containerName="pruner" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957157 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc42ea8b-f92f-4350-95d9-ae48faf461de" containerName="pruner" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957257 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc42ea8b-f92f-4350-95d9-ae48faf461de" containerName="pruner" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957273 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b22696f-6563-4c9b-a3a9-872ee3341ce0" containerName="route-controller-manager" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957281 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="680233cf-fda8-402e-95a6-a596a0edd470" containerName="oc" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957293 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e577451d-6016-4afc-913a-6d022a9a2f79" containerName="oc" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957303 4830 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f1629844-d739-4be3-b555-adbd3245dd96" containerName="controller-manager" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.957696 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.961442 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.961653 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 11 09:18:33 crc kubenswrapper[4830]: I0311 09:18:33.965810 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.122546 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.122609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-var-lock\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.122643 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kube-api-access\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 
09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.224603 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kube-api-access\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.224791 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.224949 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.225821 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-var-lock\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.225783 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-var-lock\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.251180 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kube-api-access\") pod \"installer-9-crc\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.278552 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.649744 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 11 09:18:34 crc kubenswrapper[4830]: W0311 09:18:34.659191 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podca0a5270_19fb_4d44_8fa5_24c0fd6eed32.slice/crio-bfa0e8b9d7845a51678aca97078a788d6701f0d25778c6ee25ed651fb7c496a3 WatchSource:0}: Error finding container bfa0e8b9d7845a51678aca97078a788d6701f0d25778c6ee25ed651fb7c496a3: Status 404 returned error can't find the container with id bfa0e8b9d7845a51678aca97078a788d6701f0d25778c6ee25ed651fb7c496a3 Mar 11 09:18:34 crc kubenswrapper[4830]: E0311 09:18:34.721774 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-conmon-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache]" Mar 11 09:18:34 crc kubenswrapper[4830]: I0311 09:18:34.946905 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b22696f-6563-4c9b-a3a9-872ee3341ce0" path="/var/lib/kubelet/pods/1b22696f-6563-4c9b-a3a9-872ee3341ce0/volumes" Mar 11 09:18:34 
crc kubenswrapper[4830]: I0311 09:18:34.948241 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1629844-d739-4be3-b555-adbd3245dd96" path="/var/lib/kubelet/pods/f1629844-d739-4be3-b555-adbd3245dd96/volumes" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.333515 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.333599 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.584763 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32","Type":"ContainerStarted","Data":"013768beeee63f95bb4a132abca269d76730d6d2c9c092020140d830294c1f19"} Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.584805 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32","Type":"ContainerStarted","Data":"bfa0e8b9d7845a51678aca97078a788d6701f0d25778c6ee25ed651fb7c496a3"} Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.603384 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.6033674060000003 podStartE2EDuration="2.603367406s" podCreationTimestamp="2026-03-11 09:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:18:35.602998015 +0000 UTC m=+283.384148704" watchObservedRunningTime="2026-03-11 09:18:35.603367406 +0000 UTC m=+283.384518095" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.657624 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.711375 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7dcc675794-sr9tg"] Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.712144 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.715003 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9"] Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.715746 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.717630 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dcc675794-sr9tg"] Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.717645 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.717706 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.717834 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.717927 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.718439 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 
09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.718537 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.718837 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.719489 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.719588 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.719799 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.719900 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.720467 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9"] Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.723615 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.726774 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.753842 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.753884 4830 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.801865 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.855353 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-proxy-ca-bundles\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.855409 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-client-ca\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.855446 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-client-ca\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.856191 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d5eb9d8-375b-4301-b79c-4fb7167c683c-serving-cert\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " 
pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.856444 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a24e6e-443e-4561-ac4e-22f51f76b4d4-serving-cert\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.856748 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-config\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.857888 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-config\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.857963 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thb8x\" (UniqueName: \"kubernetes.io/projected/17a24e6e-443e-4561-ac4e-22f51f76b4d4-kube-api-access-thb8x\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.858966 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-w5596\" (UniqueName: \"kubernetes.io/projected/5d5eb9d8-375b-4301-b79c-4fb7167c683c-kube-api-access-w5596\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.960266 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a24e6e-443e-4561-ac4e-22f51f76b4d4-serving-cert\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.961267 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-config\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.961296 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-config\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.961323 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thb8x\" (UniqueName: \"kubernetes.io/projected/17a24e6e-443e-4561-ac4e-22f51f76b4d4-kube-api-access-thb8x\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" 
Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.961375 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5596\" (UniqueName: \"kubernetes.io/projected/5d5eb9d8-375b-4301-b79c-4fb7167c683c-kube-api-access-w5596\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.961402 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-proxy-ca-bundles\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.961426 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-client-ca\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.961446 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-client-ca\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.961466 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d5eb9d8-375b-4301-b79c-4fb7167c683c-serving-cert\") pod 
\"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.962608 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-client-ca\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.962667 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-proxy-ca-bundles\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.962646 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-client-ca\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.962931 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-config\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.966861 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/5d5eb9d8-375b-4301-b79c-4fb7167c683c-serving-cert\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.967987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-config\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.975553 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a24e6e-443e-4561-ac4e-22f51f76b4d4-serving-cert\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.979470 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5596\" (UniqueName: \"kubernetes.io/projected/5d5eb9d8-375b-4301-b79c-4fb7167c683c-kube-api-access-w5596\") pod \"route-controller-manager-5579ccb9bf-xhqk9\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:35 crc kubenswrapper[4830]: I0311 09:18:35.984764 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thb8x\" (UniqueName: \"kubernetes.io/projected/17a24e6e-443e-4561-ac4e-22f51f76b4d4-kube-api-access-thb8x\") pod \"controller-manager-7dcc675794-sr9tg\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:36 crc kubenswrapper[4830]: I0311 
09:18:36.069345 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:36 crc kubenswrapper[4830]: I0311 09:18:36.078652 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:36 crc kubenswrapper[4830]: I0311 09:18:36.479046 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dcc675794-sr9tg"] Mar 11 09:18:36 crc kubenswrapper[4830]: I0311 09:18:36.534358 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9"] Mar 11 09:18:36 crc kubenswrapper[4830]: W0311 09:18:36.552068 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d5eb9d8_375b_4301_b79c_4fb7167c683c.slice/crio-23746ff362596935eee812208b9cc2083fbf4dfa049c1aca3d44e4e768206b50 WatchSource:0}: Error finding container 23746ff362596935eee812208b9cc2083fbf4dfa049c1aca3d44e4e768206b50: Status 404 returned error can't find the container with id 23746ff362596935eee812208b9cc2083fbf4dfa049c1aca3d44e4e768206b50 Mar 11 09:18:36 crc kubenswrapper[4830]: I0311 09:18:36.591110 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" event={"ID":"17a24e6e-443e-4561-ac4e-22f51f76b4d4","Type":"ContainerStarted","Data":"ec1249555f4ce5058499751ddaf4f739ae04ce520130ef40c8d40e9d911fb296"} Mar 11 09:18:36 crc kubenswrapper[4830]: I0311 09:18:36.592232 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" 
event={"ID":"5d5eb9d8-375b-4301-b79c-4fb7167c683c","Type":"ContainerStarted","Data":"23746ff362596935eee812208b9cc2083fbf4dfa049c1aca3d44e4e768206b50"} Mar 11 09:18:36 crc kubenswrapper[4830]: I0311 09:18:36.638655 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.604333 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" event={"ID":"17a24e6e-443e-4561-ac4e-22f51f76b4d4","Type":"ContainerStarted","Data":"b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744"} Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.604621 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.605935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" event={"ID":"5d5eb9d8-375b-4301-b79c-4fb7167c683c","Type":"ContainerStarted","Data":"7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e"} Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.606250 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.610812 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.610882 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.627519 4830 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" podStartSLOduration=7.627505175 podStartE2EDuration="7.627505175s" podCreationTimestamp="2026-03-11 09:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:18:37.618647403 +0000 UTC m=+285.399798112" watchObservedRunningTime="2026-03-11 09:18:37.627505175 +0000 UTC m=+285.408655864" Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.669817 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" podStartSLOduration=7.66979752 podStartE2EDuration="7.66979752s" podCreationTimestamp="2026-03-11 09:18:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:18:37.66522622 +0000 UTC m=+285.446376909" watchObservedRunningTime="2026-03-11 09:18:37.66979752 +0000 UTC m=+285.450948209" Mar 11 09:18:37 crc kubenswrapper[4830]: I0311 09:18:37.677221 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngwt4"] Mar 11 09:18:38 crc kubenswrapper[4830]: I0311 09:18:38.611177 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ngwt4" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerName="registry-server" containerID="cri-o://c7e98b863bd7960e112ea6ce3c201c7bbb5b86b55dd2feede7382ee6727a58f5" gracePeriod=2 Mar 11 09:18:39 crc kubenswrapper[4830]: I0311 09:18:39.618704 4830 generic.go:334] "Generic (PLEG): container finished" podID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerID="c7e98b863bd7960e112ea6ce3c201c7bbb5b86b55dd2feede7382ee6727a58f5" exitCode=0 Mar 11 09:18:39 crc kubenswrapper[4830]: I0311 09:18:39.618779 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ngwt4" event={"ID":"63e229ca-0853-43c9-8ad6-a5e236df0812","Type":"ContainerDied","Data":"c7e98b863bd7960e112ea6ce3c201c7bbb5b86b55dd2feede7382ee6727a58f5"} Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.202483 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.361820 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr87t\" (UniqueName: \"kubernetes.io/projected/63e229ca-0853-43c9-8ad6-a5e236df0812-kube-api-access-tr87t\") pod \"63e229ca-0853-43c9-8ad6-a5e236df0812\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.361917 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-catalog-content\") pod \"63e229ca-0853-43c9-8ad6-a5e236df0812\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.361989 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-utilities\") pod \"63e229ca-0853-43c9-8ad6-a5e236df0812\" (UID: \"63e229ca-0853-43c9-8ad6-a5e236df0812\") " Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.362782 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-utilities" (OuterVolumeSpecName: "utilities") pod "63e229ca-0853-43c9-8ad6-a5e236df0812" (UID: "63e229ca-0853-43c9-8ad6-a5e236df0812"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.369934 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e229ca-0853-43c9-8ad6-a5e236df0812-kube-api-access-tr87t" (OuterVolumeSpecName: "kube-api-access-tr87t") pod "63e229ca-0853-43c9-8ad6-a5e236df0812" (UID: "63e229ca-0853-43c9-8ad6-a5e236df0812"). InnerVolumeSpecName "kube-api-access-tr87t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.391078 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63e229ca-0853-43c9-8ad6-a5e236df0812" (UID: "63e229ca-0853-43c9-8ad6-a5e236df0812"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.463954 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.463990 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr87t\" (UniqueName: \"kubernetes.io/projected/63e229ca-0853-43c9-8ad6-a5e236df0812-kube-api-access-tr87t\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.464001 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63e229ca-0853-43c9-8ad6-a5e236df0812-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.639833 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngwt4" 
event={"ID":"63e229ca-0853-43c9-8ad6-a5e236df0812","Type":"ContainerDied","Data":"8f469d0ac98afa17f7a8a49000ff6c2d89f9ca476f952c1caa69989804c3daf3"} Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.639903 4830 scope.go:117] "RemoveContainer" containerID="c7e98b863bd7960e112ea6ce3c201c7bbb5b86b55dd2feede7382ee6727a58f5" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.639979 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngwt4" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.643293 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ctcw6" event={"ID":"2438f79c-45d2-4b4f-951b-630d3fb2c740","Type":"ContainerStarted","Data":"d1b1085e5fe00d505c1be4487326aac211bdf6d3493086835a5738984048e1fd"} Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.645413 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6st5" event={"ID":"b6155028-4ba3-48be-b83d-7bbe65f28ba7","Type":"ContainerStarted","Data":"f82588298a3aa4fe180efb0045e87538178318511717cec88b7ce49535e6395e"} Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.647436 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbnm8" event={"ID":"0110c11f-22d8-4a16-b11b-08c46b6a3bed","Type":"ContainerStarted","Data":"b4ae693382e19fbb673451f687d6d47aa327485ebba1429a57b9dd21e10fa94c"} Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.661648 4830 scope.go:117] "RemoveContainer" containerID="3af0514b220b7146be1d0c827c2887bc089d1d948456c5abcad32e2dafa02160" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.695751 4830 scope.go:117] "RemoveContainer" containerID="1491318bbb0544d0e7220781dc57b920da53ca22b89cd99636c7e93212f442b0" Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.728241 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ngwt4"] Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.731742 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngwt4"] Mar 11 09:18:42 crc kubenswrapper[4830]: I0311 09:18:42.942336 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" path="/var/lib/kubelet/pods/63e229ca-0853-43c9-8ad6-a5e236df0812/volumes" Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.060964 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.061042 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.061096 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.061624 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.061677 4830 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241" gracePeriod=600 Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.689375 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvk9l" event={"ID":"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66","Type":"ContainerStarted","Data":"46268d804b43a32fde6eadd0b714f77921f8879c987da3eb3a9f7191cea26567"} Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.692166 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241" exitCode=0 Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.692229 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241"} Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.699943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9zpp" event={"ID":"7febf059-370d-4a68-a543-3b23879ba479","Type":"ContainerStarted","Data":"e6f5008ca1aef966c2e2824f61827c11e1daf475cf560fb0f418531d4342f63f"} Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.702137 4830 generic.go:334] "Generic (PLEG): container finished" podID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerID="f82588298a3aa4fe180efb0045e87538178318511717cec88b7ce49535e6395e" exitCode=0 Mar 11 09:18:43 crc kubenswrapper[4830]: I0311 09:18:43.702730 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6st5" 
event={"ID":"b6155028-4ba3-48be-b83d-7bbe65f28ba7","Type":"ContainerDied","Data":"f82588298a3aa4fe180efb0045e87538178318511717cec88b7ce49535e6395e"} Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.738603 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"63a6a7ac35cab5a7e04856357852e727639b515067afed5509adcf1a3d3bc6f0"} Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.741378 4830 generic.go:334] "Generic (PLEG): container finished" podID="7febf059-370d-4a68-a543-3b23879ba479" containerID="e6f5008ca1aef966c2e2824f61827c11e1daf475cf560fb0f418531d4342f63f" exitCode=0 Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.741437 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9zpp" event={"ID":"7febf059-370d-4a68-a543-3b23879ba479","Type":"ContainerDied","Data":"e6f5008ca1aef966c2e2824f61827c11e1daf475cf560fb0f418531d4342f63f"} Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.745435 4830 generic.go:334] "Generic (PLEG): container finished" podID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerID="d1b1085e5fe00d505c1be4487326aac211bdf6d3493086835a5738984048e1fd" exitCode=0 Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.745519 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ctcw6" event={"ID":"2438f79c-45d2-4b4f-951b-630d3fb2c740","Type":"ContainerDied","Data":"d1b1085e5fe00d505c1be4487326aac211bdf6d3493086835a5738984048e1fd"} Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.751343 4830 generic.go:334] "Generic (PLEG): container finished" podID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerID="b4ae693382e19fbb673451f687d6d47aa327485ebba1429a57b9dd21e10fa94c" exitCode=0 Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.751419 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-fbnm8" event={"ID":"0110c11f-22d8-4a16-b11b-08c46b6a3bed","Type":"ContainerDied","Data":"b4ae693382e19fbb673451f687d6d47aa327485ebba1429a57b9dd21e10fa94c"} Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.755640 4830 generic.go:334] "Generic (PLEG): container finished" podID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerID="46268d804b43a32fde6eadd0b714f77921f8879c987da3eb3a9f7191cea26567" exitCode=0 Mar 11 09:18:44 crc kubenswrapper[4830]: I0311 09:18:44.755683 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvk9l" event={"ID":"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66","Type":"ContainerDied","Data":"46268d804b43a32fde6eadd0b714f77921f8879c987da3eb3a9f7191cea26567"} Mar 11 09:18:44 crc kubenswrapper[4830]: E0311 09:18:44.856598 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7febf059_370d_4a68_a543_3b23879ba479.slice/crio-conmon-e6f5008ca1aef966c2e2824f61827c11e1daf475cf560fb0f418531d4342f63f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-conmon-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fd7ddf_f544_4e6c_bcb6_1f1805c6570b.slice/crio-a7f6316e6d5e00e9b10e9d80ec73988f10a11b0898238a55361597a16cc323c8.scope\": RecentStats: unable to find data in memory cache]" Mar 11 09:18:45 crc kubenswrapper[4830]: I0311 09:18:45.387827 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:18:46 crc kubenswrapper[4830]: I0311 09:18:46.780711 4830 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-m6st5" event={"ID":"b6155028-4ba3-48be-b83d-7bbe65f28ba7","Type":"ContainerStarted","Data":"9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274"} Mar 11 09:18:46 crc kubenswrapper[4830]: I0311 09:18:46.785267 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvk9l" event={"ID":"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66","Type":"ContainerStarted","Data":"16e3013e0dbd1711bcc7b03d12c813e4336678076e4d6e8208680fb194a08a75"} Mar 11 09:18:46 crc kubenswrapper[4830]: I0311 09:18:46.791273 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9zpp" event={"ID":"7febf059-370d-4a68-a543-3b23879ba479","Type":"ContainerStarted","Data":"4fabe6c2638348f3f715899a472a45c820512f348470217255228292ad020a41"} Mar 11 09:18:46 crc kubenswrapper[4830]: I0311 09:18:46.796408 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnft" event={"ID":"ebefef77-9e3b-45d5-8301-53df1b75c9bb","Type":"ContainerStarted","Data":"4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd"} Mar 11 09:18:46 crc kubenswrapper[4830]: I0311 09:18:46.809407 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vvk9l" podStartSLOduration=3.526661215 podStartE2EDuration="54.809389775s" podCreationTimestamp="2026-03-11 09:17:52 +0000 UTC" firstStartedPulling="2026-03-11 09:17:54.90787776 +0000 UTC m=+242.689028449" lastFinishedPulling="2026-03-11 09:18:46.19060631 +0000 UTC m=+293.971757009" observedRunningTime="2026-03-11 09:18:46.806519964 +0000 UTC m=+294.587670673" watchObservedRunningTime="2026-03-11 09:18:46.809389775 +0000 UTC m=+294.590540464" Mar 11 09:18:46 crc kubenswrapper[4830]: I0311 09:18:46.825540 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-m6st5" podStartSLOduration=3.9449648010000002 podStartE2EDuration="54.825525155s" podCreationTimestamp="2026-03-11 09:17:52 +0000 UTC" firstStartedPulling="2026-03-11 09:17:54.898483806 +0000 UTC m=+242.679634495" lastFinishedPulling="2026-03-11 09:18:45.77904415 +0000 UTC m=+293.560194849" observedRunningTime="2026-03-11 09:18:46.82534443 +0000 UTC m=+294.606495139" watchObservedRunningTime="2026-03-11 09:18:46.825525155 +0000 UTC m=+294.606675844" Mar 11 09:18:46 crc kubenswrapper[4830]: I0311 09:18:46.843566 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x9zpp" podStartSLOduration=2.58742423 podStartE2EDuration="53.843549269s" podCreationTimestamp="2026-03-11 09:17:53 +0000 UTC" firstStartedPulling="2026-03-11 09:17:54.910708963 +0000 UTC m=+242.691859662" lastFinishedPulling="2026-03-11 09:18:46.166834002 +0000 UTC m=+293.947984701" observedRunningTime="2026-03-11 09:18:46.841516651 +0000 UTC m=+294.622667360" watchObservedRunningTime="2026-03-11 09:18:46.843549269 +0000 UTC m=+294.624699958" Mar 11 09:18:47 crc kubenswrapper[4830]: I0311 09:18:47.806072 4830 generic.go:334] "Generic (PLEG): container finished" podID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerID="4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd" exitCode=0 Mar 11 09:18:47 crc kubenswrapper[4830]: I0311 09:18:47.806156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnft" event={"ID":"ebefef77-9e3b-45d5-8301-53df1b75c9bb","Type":"ContainerDied","Data":"4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd"} Mar 11 09:18:49 crc kubenswrapper[4830]: I0311 09:18:49.828650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ctcw6" 
event={"ID":"2438f79c-45d2-4b4f-951b-630d3fb2c740","Type":"ContainerStarted","Data":"63082d06d754858f7393926be38640a7952c235118ca7c7524553c8472b30a58"} Mar 11 09:18:49 crc kubenswrapper[4830]: I0311 09:18:49.852294 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ctcw6" podStartSLOduration=36.82886052 podStartE2EDuration="54.852276949s" podCreationTimestamp="2026-03-11 09:17:55 +0000 UTC" firstStartedPulling="2026-03-11 09:18:30.402359124 +0000 UTC m=+278.183509813" lastFinishedPulling="2026-03-11 09:18:48.425775553 +0000 UTC m=+296.206926242" observedRunningTime="2026-03-11 09:18:49.851813076 +0000 UTC m=+297.632963785" watchObservedRunningTime="2026-03-11 09:18:49.852276949 +0000 UTC m=+297.633427638" Mar 11 09:18:50 crc kubenswrapper[4830]: I0311 09:18:50.732540 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dcc675794-sr9tg"] Mar 11 09:18:50 crc kubenswrapper[4830]: I0311 09:18:50.733222 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" podUID="17a24e6e-443e-4561-ac4e-22f51f76b4d4" containerName="controller-manager" containerID="cri-o://b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744" gracePeriod=30 Mar 11 09:18:50 crc kubenswrapper[4830]: I0311 09:18:50.793711 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9"] Mar 11 09:18:50 crc kubenswrapper[4830]: I0311 09:18:50.793930 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" podUID="5d5eb9d8-375b-4301-b79c-4fb7167c683c" containerName="route-controller-manager" containerID="cri-o://7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e" gracePeriod=30 Mar 11 09:18:50 crc 
kubenswrapper[4830]: I0311 09:18:50.835907 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbnm8" event={"ID":"0110c11f-22d8-4a16-b11b-08c46b6a3bed","Type":"ContainerStarted","Data":"918f79df8188d8d923d5757298ec1f2159bc23269d63b702d43402890f4b0b33"} Mar 11 09:18:50 crc kubenswrapper[4830]: I0311 09:18:50.838280 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnft" event={"ID":"ebefef77-9e3b-45d5-8301-53df1b75c9bb","Type":"ContainerStarted","Data":"62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40"} Mar 11 09:18:50 crc kubenswrapper[4830]: I0311 09:18:50.854898 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbnm8" podStartSLOduration=35.638690659 podStartE2EDuration="54.854878184s" podCreationTimestamp="2026-03-11 09:17:56 +0000 UTC" firstStartedPulling="2026-03-11 09:18:30.418488083 +0000 UTC m=+278.199638772" lastFinishedPulling="2026-03-11 09:18:49.634675568 +0000 UTC m=+297.415826297" observedRunningTime="2026-03-11 09:18:50.854452142 +0000 UTC m=+298.635602841" watchObservedRunningTime="2026-03-11 09:18:50.854878184 +0000 UTC m=+298.636028883" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.337336 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.343502 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.352270 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbnft" podStartSLOduration=3.6248594240000003 podStartE2EDuration="58.352257639s" podCreationTimestamp="2026-03-11 09:17:53 +0000 UTC" firstStartedPulling="2026-03-11 09:17:54.92295723 +0000 UTC m=+242.704107919" lastFinishedPulling="2026-03-11 09:18:49.650355455 +0000 UTC m=+297.431506134" observedRunningTime="2026-03-11 09:18:50.883364626 +0000 UTC m=+298.664515345" watchObservedRunningTime="2026-03-11 09:18:51.352257639 +0000 UTC m=+299.133408348" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403352 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thb8x\" (UniqueName: \"kubernetes.io/projected/17a24e6e-443e-4561-ac4e-22f51f76b4d4-kube-api-access-thb8x\") pod \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403415 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-client-ca\") pod \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403453 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-config\") pod \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403492 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5d5eb9d8-375b-4301-b79c-4fb7167c683c-serving-cert\") pod \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403530 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-proxy-ca-bundles\") pod \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403573 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-config\") pod \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403613 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-client-ca\") pod \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403658 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5596\" (UniqueName: \"kubernetes.io/projected/5d5eb9d8-375b-4301-b79c-4fb7167c683c-kube-api-access-w5596\") pod \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\" (UID: \"5d5eb9d8-375b-4301-b79c-4fb7167c683c\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.403716 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a24e6e-443e-4561-ac4e-22f51f76b4d4-serving-cert\") pod \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\" (UID: \"17a24e6e-443e-4561-ac4e-22f51f76b4d4\") " Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.404318 4830 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-client-ca" (OuterVolumeSpecName: "client-ca") pod "5d5eb9d8-375b-4301-b79c-4fb7167c683c" (UID: "5d5eb9d8-375b-4301-b79c-4fb7167c683c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.404619 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-config" (OuterVolumeSpecName: "config") pod "5d5eb9d8-375b-4301-b79c-4fb7167c683c" (UID: "5d5eb9d8-375b-4301-b79c-4fb7167c683c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.404664 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "17a24e6e-443e-4561-ac4e-22f51f76b4d4" (UID: "17a24e6e-443e-4561-ac4e-22f51f76b4d4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.404748 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "17a24e6e-443e-4561-ac4e-22f51f76b4d4" (UID: "17a24e6e-443e-4561-ac4e-22f51f76b4d4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.404809 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-config" (OuterVolumeSpecName: "config") pod "17a24e6e-443e-4561-ac4e-22f51f76b4d4" (UID: "17a24e6e-443e-4561-ac4e-22f51f76b4d4"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.414159 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5eb9d8-375b-4301-b79c-4fb7167c683c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5d5eb9d8-375b-4301-b79c-4fb7167c683c" (UID: "5d5eb9d8-375b-4301-b79c-4fb7167c683c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.414175 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5eb9d8-375b-4301-b79c-4fb7167c683c-kube-api-access-w5596" (OuterVolumeSpecName: "kube-api-access-w5596") pod "5d5eb9d8-375b-4301-b79c-4fb7167c683c" (UID: "5d5eb9d8-375b-4301-b79c-4fb7167c683c"). InnerVolumeSpecName "kube-api-access-w5596". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.414202 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a24e6e-443e-4561-ac4e-22f51f76b4d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17a24e6e-443e-4561-ac4e-22f51f76b4d4" (UID: "17a24e6e-443e-4561-ac4e-22f51f76b4d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.414263 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a24e6e-443e-4561-ac4e-22f51f76b4d4-kube-api-access-thb8x" (OuterVolumeSpecName: "kube-api-access-thb8x") pod "17a24e6e-443e-4561-ac4e-22f51f76b4d4" (UID: "17a24e6e-443e-4561-ac4e-22f51f76b4d4"). InnerVolumeSpecName "kube-api-access-thb8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.504969 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17a24e6e-443e-4561-ac4e-22f51f76b4d4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.505027 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thb8x\" (UniqueName: \"kubernetes.io/projected/17a24e6e-443e-4561-ac4e-22f51f76b4d4-kube-api-access-thb8x\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.505041 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.505053 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.505064 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d5eb9d8-375b-4301-b79c-4fb7167c683c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.505075 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.505086 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d5eb9d8-375b-4301-b79c-4fb7167c683c-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.505096 4830 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17a24e6e-443e-4561-ac4e-22f51f76b4d4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.505107 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5596\" (UniqueName: \"kubernetes.io/projected/5d5eb9d8-375b-4301-b79c-4fb7167c683c-kube-api-access-w5596\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.847434 4830 generic.go:334] "Generic (PLEG): container finished" podID="17a24e6e-443e-4561-ac4e-22f51f76b4d4" containerID="b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744" exitCode=0 Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.847524 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" event={"ID":"17a24e6e-443e-4561-ac4e-22f51f76b4d4","Type":"ContainerDied","Data":"b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744"} Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.847527 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.847558 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dcc675794-sr9tg" event={"ID":"17a24e6e-443e-4561-ac4e-22f51f76b4d4","Type":"ContainerDied","Data":"ec1249555f4ce5058499751ddaf4f739ae04ce520130ef40c8d40e9d911fb296"} Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.847580 4830 scope.go:117] "RemoveContainer" containerID="b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.850603 4830 generic.go:334] "Generic (PLEG): container finished" podID="5d5eb9d8-375b-4301-b79c-4fb7167c683c" containerID="7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e" exitCode=0 Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.850675 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" event={"ID":"5d5eb9d8-375b-4301-b79c-4fb7167c683c","Type":"ContainerDied","Data":"7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e"} Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.850783 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" event={"ID":"5d5eb9d8-375b-4301-b79c-4fb7167c683c","Type":"ContainerDied","Data":"23746ff362596935eee812208b9cc2083fbf4dfa049c1aca3d44e4e768206b50"} Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.850703 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.887562 4830 scope.go:117] "RemoveContainer" containerID="b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744" Mar 11 09:18:51 crc kubenswrapper[4830]: E0311 09:18:51.888257 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744\": container with ID starting with b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744 not found: ID does not exist" containerID="b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.888309 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744"} err="failed to get container status \"b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744\": rpc error: code = NotFound desc = could not find container \"b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744\": container with ID starting with b0c2634e58f41fc56263db31782a1ae560adaabff1f11f4815fce8234fde3744 not found: ID does not exist" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.888339 4830 scope.go:117] "RemoveContainer" containerID="7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.895625 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dcc675794-sr9tg"] Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.901688 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7dcc675794-sr9tg"] Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.915987 4830 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9"] Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.918504 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5579ccb9bf-xhqk9"] Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.922131 4830 scope.go:117] "RemoveContainer" containerID="7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e" Mar 11 09:18:51 crc kubenswrapper[4830]: E0311 09:18:51.923790 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e\": container with ID starting with 7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e not found: ID does not exist" containerID="7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e" Mar 11 09:18:51 crc kubenswrapper[4830]: I0311 09:18:51.923827 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e"} err="failed to get container status \"7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e\": rpc error: code = NotFound desc = could not find container \"7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e\": container with ID starting with 7bead016fde6ae87b086e1f4cd29ac01d748fc4d4dd295ccbbfbee9e6509839e not found: ID does not exist" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.724080 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs"] Mar 11 09:18:52 crc kubenswrapper[4830]: E0311 09:18:52.724462 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerName="registry-server" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.724493 
4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerName="registry-server" Mar 11 09:18:52 crc kubenswrapper[4830]: E0311 09:18:52.724511 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a24e6e-443e-4561-ac4e-22f51f76b4d4" containerName="controller-manager" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.724524 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a24e6e-443e-4561-ac4e-22f51f76b4d4" containerName="controller-manager" Mar 11 09:18:52 crc kubenswrapper[4830]: E0311 09:18:52.724553 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerName="extract-content" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.724565 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerName="extract-content" Mar 11 09:18:52 crc kubenswrapper[4830]: E0311 09:18:52.724580 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5eb9d8-375b-4301-b79c-4fb7167c683c" containerName="route-controller-manager" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.724593 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5eb9d8-375b-4301-b79c-4fb7167c683c" containerName="route-controller-manager" Mar 11 09:18:52 crc kubenswrapper[4830]: E0311 09:18:52.724617 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerName="extract-utilities" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.724629 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerName="extract-utilities" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.724830 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a24e6e-443e-4561-ac4e-22f51f76b4d4" containerName="controller-manager" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 
09:18:52.724856 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5eb9d8-375b-4301-b79c-4fb7167c683c" containerName="route-controller-manager" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.724883 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e229ca-0853-43c9-8ad6-a5e236df0812" containerName="registry-server" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.725604 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.728200 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.728837 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.729277 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.729530 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.730425 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n"] Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.733635 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.734031 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.737822 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.739087 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.739283 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.739347 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.739593 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.743608 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.750884 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs"] Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.751576 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.756364 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.763235 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n"] 
Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.922823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf6e34a1-b355-4d79-ba6a-2a63673342ae-serving-cert\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.923189 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-config\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.923235 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-proxy-ca-bundles\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.923257 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqd6k\" (UniqueName: \"kubernetes.io/projected/cf6e34a1-b355-4d79-ba6a-2a63673342ae-kube-api-access-wqd6k\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.923284 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-client-ca\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.923304 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-serving-cert\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.923322 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-client-ca\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.923362 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-config\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.923393 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpzpb\" (UniqueName: \"kubernetes.io/projected/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-kube-api-access-hpzpb\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " 
pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.940503 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a24e6e-443e-4561-ac4e-22f51f76b4d4" path="/var/lib/kubelet/pods/17a24e6e-443e-4561-ac4e-22f51f76b4d4/volumes" Mar 11 09:18:52 crc kubenswrapper[4830]: I0311 09:18:52.941303 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5eb9d8-375b-4301-b79c-4fb7167c683c" path="/var/lib/kubelet/pods/5d5eb9d8-375b-4301-b79c-4fb7167c683c/volumes" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.007683 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.007738 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024302 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-config\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024370 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-proxy-ca-bundles\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024390 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqd6k\" (UniqueName: 
\"kubernetes.io/projected/cf6e34a1-b355-4d79-ba6a-2a63673342ae-kube-api-access-wqd6k\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024420 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-client-ca\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024452 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-serving-cert\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024475 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-client-ca\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024528 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-config\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024567 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzpb\" (UniqueName: \"kubernetes.io/projected/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-kube-api-access-hpzpb\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.024599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf6e34a1-b355-4d79-ba6a-2a63673342ae-serving-cert\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.026339 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.026871 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.027083 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.027283 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.028131 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.033897 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.034397 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.035711 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-client-ca\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.035977 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-proxy-ca-bundles\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.037294 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-config\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.037761 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-config\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.038498 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-client-ca\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: 
\"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.042574 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-serving-cert\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.042906 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.043798 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.044527 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf6e34a1-b355-4d79-ba6a-2a63673342ae-serving-cert\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.055523 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.055720 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.059401 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.065303 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wqd6k\" (UniqueName: \"kubernetes.io/projected/cf6e34a1-b355-4d79-ba6a-2a63673342ae-kube-api-access-wqd6k\") pod \"controller-manager-5cdb8c54d-wrjbs\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.069460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpzpb\" (UniqueName: \"kubernetes.io/projected/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-kube-api-access-hpzpb\") pod \"route-controller-manager-58b6788979-g8q9n\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.072854 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.081941 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.357763 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.367287 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.415526 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.415582 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.471520 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.529852 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n"] Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.544098 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs"] Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.553065 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.553105 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:18:53 crc kubenswrapper[4830]: W0311 09:18:53.553425 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf6e34a1_b355_4d79_ba6a_2a63673342ae.slice/crio-0792b10dafa7c0d2a294707b25ed247d5d7c8330b3c8056decb0b4c4ab2ead46 WatchSource:0}: Error finding container 0792b10dafa7c0d2a294707b25ed247d5d7c8330b3c8056decb0b4c4ab2ead46: Status 404 returned error can't find the container with id 0792b10dafa7c0d2a294707b25ed247d5d7c8330b3c8056decb0b4c4ab2ead46 Mar 11 09:18:53 crc 
kubenswrapper[4830]: I0311 09:18:53.596151 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.780054 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.780102 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.837638 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.865827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" event={"ID":"b4d7ac84-63c4-4475-aaf0-6eacd3e788be","Type":"ContainerStarted","Data":"78a7efac40dab5d41369a67fbb1c4f72f72c28642c6192e8b8fbedb3308e1d6a"} Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.865884 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" event={"ID":"b4d7ac84-63c4-4475-aaf0-6eacd3e788be","Type":"ContainerStarted","Data":"b41ebdefdff0e02f3a090351c62a0e8f4a3a4d5171f9dae1c1204b4b164e004a"} Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.867203 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.868360 4830 patch_prober.go:28] interesting pod/route-controller-manager-58b6788979-g8q9n container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: 
connect: connection refused" start-of-body= Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.868397 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" podUID="b4d7ac84-63c4-4475-aaf0-6eacd3e788be" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.869421 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" event={"ID":"cf6e34a1-b355-4d79-ba6a-2a63673342ae","Type":"ContainerStarted","Data":"35f5f9f2ab5a546eee2a87125843f65b1c849e34a83de76ddec580fa09941155"} Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.869486 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" event={"ID":"cf6e34a1-b355-4d79-ba6a-2a63673342ae","Type":"ContainerStarted","Data":"0792b10dafa7c0d2a294707b25ed247d5d7c8330b3c8056decb0b4c4ab2ead46"} Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.870385 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.878882 4830 patch_prober.go:28] interesting pod/controller-manager-5cdb8c54d-wrjbs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.878944 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" podUID="cf6e34a1-b355-4d79-ba6a-2a63673342ae" containerName="controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.885594 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" podStartSLOduration=3.885574971 podStartE2EDuration="3.885574971s" podCreationTimestamp="2026-03-11 09:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:18:53.881946168 +0000 UTC m=+301.663096887" watchObservedRunningTime="2026-03-11 09:18:53.885574971 +0000 UTC m=+301.666725670" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.912294 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" podStartSLOduration=3.912272752 podStartE2EDuration="3.912272752s" podCreationTimestamp="2026-03-11 09:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:18:53.90692393 +0000 UTC m=+301.688074629" watchObservedRunningTime="2026-03-11 09:18:53.912272752 +0000 UTC m=+301.693423441" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.925083 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.927697 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:18:53 crc kubenswrapper[4830]: I0311 09:18:53.928461 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:18:54 crc kubenswrapper[4830]: I0311 09:18:54.881354 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:18:55 crc kubenswrapper[4830]: I0311 09:18:55.106209 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:18:56 crc kubenswrapper[4830]: I0311 09:18:56.321849 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:18:56 crc kubenswrapper[4830]: I0311 09:18:56.323097 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:18:56 crc kubenswrapper[4830]: I0311 09:18:56.745306 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:18:56 crc kubenswrapper[4830]: I0311 09:18:56.745484 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:18:56 crc kubenswrapper[4830]: I0311 09:18:56.868956 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vvk9l"] Mar 11 09:18:56 crc kubenswrapper[4830]: I0311 09:18:56.869895 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vvk9l" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerName="registry-server" containerID="cri-o://16e3013e0dbd1711bcc7b03d12c813e4336678076e4d6e8208680fb194a08a75" gracePeriod=2 Mar 11 09:18:57 crc kubenswrapper[4830]: I0311 09:18:57.384152 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ctcw6" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="registry-server" probeResult="failure" output=< Mar 11 09:18:57 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 09:18:57 crc 
kubenswrapper[4830]: > Mar 11 09:18:57 crc kubenswrapper[4830]: I0311 09:18:57.790332 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fbnm8" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="registry-server" probeResult="failure" output=< Mar 11 09:18:57 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 09:18:57 crc kubenswrapper[4830]: > Mar 11 09:18:57 crc kubenswrapper[4830]: I0311 09:18:57.900104 4830 generic.go:334] "Generic (PLEG): container finished" podID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerID="16e3013e0dbd1711bcc7b03d12c813e4336678076e4d6e8208680fb194a08a75" exitCode=0 Mar 11 09:18:57 crc kubenswrapper[4830]: I0311 09:18:57.900149 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvk9l" event={"ID":"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66","Type":"ContainerDied","Data":"16e3013e0dbd1711bcc7b03d12c813e4336678076e4d6e8208680fb194a08a75"} Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.848673 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.933545 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvk9l" Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.944314 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvk9l" event={"ID":"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66","Type":"ContainerDied","Data":"6f175126e09e5cf12dbcfd97ea672787cb3bfbe875baab793fd8b58265df3c80"} Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.944370 4830 scope.go:117] "RemoveContainer" containerID="16e3013e0dbd1711bcc7b03d12c813e4336678076e4d6e8208680fb194a08a75" Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.945051 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cnkj\" (UniqueName: \"kubernetes.io/projected/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-kube-api-access-8cnkj\") pod \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.945147 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-catalog-content\") pod \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.945177 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-utilities\") pod \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\" (UID: \"e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66\") " Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.946274 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-utilities" (OuterVolumeSpecName: "utilities") pod "e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" (UID: "e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.952579 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-kube-api-access-8cnkj" (OuterVolumeSpecName: "kube-api-access-8cnkj") pod "e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" (UID: "e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66"). InnerVolumeSpecName "kube-api-access-8cnkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.973231 4830 scope.go:117] "RemoveContainer" containerID="46268d804b43a32fde6eadd0b714f77921f8879c987da3eb3a9f7191cea26567" Mar 11 09:18:58 crc kubenswrapper[4830]: I0311 09:18:58.998263 4830 scope.go:117] "RemoveContainer" containerID="aa812bf6f24f9b54d10d6274e6fec320b59f6ea957d66252224ff0cf8039d3d1" Mar 11 09:18:59 crc kubenswrapper[4830]: I0311 09:18:59.006000 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" (UID: "e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:18:59 crc kubenswrapper[4830]: I0311 09:18:59.046737 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cnkj\" (UniqueName: \"kubernetes.io/projected/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-kube-api-access-8cnkj\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:59 crc kubenswrapper[4830]: I0311 09:18:59.046785 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:59 crc kubenswrapper[4830]: I0311 09:18:59.046798 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:18:59 crc kubenswrapper[4830]: I0311 09:18:59.274341 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vvk9l"] Mar 11 09:18:59 crc kubenswrapper[4830]: I0311 09:18:59.278089 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vvk9l"] Mar 11 09:19:00 crc kubenswrapper[4830]: I0311 09:19:00.948665 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" path="/var/lib/kubelet/pods/e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66/volumes" Mar 11 09:19:03 crc kubenswrapper[4830]: I0311 09:19:03.840363 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:19:04 crc kubenswrapper[4830]: I0311 09:19:04.958263 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbnft"] Mar 11 09:19:04 crc kubenswrapper[4830]: I0311 09:19:04.958756 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-wbnft" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerName="registry-server" containerID="cri-o://62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40" gracePeriod=2 Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.524038 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.653284 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhwgx\" (UniqueName: \"kubernetes.io/projected/ebefef77-9e3b-45d5-8301-53df1b75c9bb-kube-api-access-zhwgx\") pod \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.653356 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-catalog-content\") pod \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.653426 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-utilities\") pod \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\" (UID: \"ebefef77-9e3b-45d5-8301-53df1b75c9bb\") " Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.654342 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-utilities" (OuterVolumeSpecName: "utilities") pod "ebefef77-9e3b-45d5-8301-53df1b75c9bb" (UID: "ebefef77-9e3b-45d5-8301-53df1b75c9bb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.659146 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebefef77-9e3b-45d5-8301-53df1b75c9bb-kube-api-access-zhwgx" (OuterVolumeSpecName: "kube-api-access-zhwgx") pod "ebefef77-9e3b-45d5-8301-53df1b75c9bb" (UID: "ebefef77-9e3b-45d5-8301-53df1b75c9bb"). InnerVolumeSpecName "kube-api-access-zhwgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.725205 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebefef77-9e3b-45d5-8301-53df1b75c9bb" (UID: "ebefef77-9e3b-45d5-8301-53df1b75c9bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.756599 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhwgx\" (UniqueName: \"kubernetes.io/projected/ebefef77-9e3b-45d5-8301-53df1b75c9bb-kube-api-access-zhwgx\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.756731 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.756749 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebefef77-9e3b-45d5-8301-53df1b75c9bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.998303 4830 generic.go:334] "Generic (PLEG): container finished" podID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" 
containerID="62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40" exitCode=0 Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.998355 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnft" event={"ID":"ebefef77-9e3b-45d5-8301-53df1b75c9bb","Type":"ContainerDied","Data":"62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40"} Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.998405 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbnft" event={"ID":"ebefef77-9e3b-45d5-8301-53df1b75c9bb","Type":"ContainerDied","Data":"7f51526411ec0eb6ddcfa211e449953346c2a629a9b3d0a658a6a579cae72f1c"} Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.998435 4830 scope.go:117] "RemoveContainer" containerID="62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40" Mar 11 09:19:05 crc kubenswrapper[4830]: I0311 09:19:05.998444 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbnft" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.029924 4830 scope.go:117] "RemoveContainer" containerID="4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.032848 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbnft"] Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.040916 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wbnft"] Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.046227 4830 scope.go:117] "RemoveContainer" containerID="a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.088107 4830 scope.go:117] "RemoveContainer" containerID="62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40" Mar 11 09:19:06 crc kubenswrapper[4830]: E0311 09:19:06.092576 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40\": container with ID starting with 62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40 not found: ID does not exist" containerID="62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.092667 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40"} err="failed to get container status \"62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40\": rpc error: code = NotFound desc = could not find container \"62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40\": container with ID starting with 62f4cbfa487be03a6c036f182353da9a125c961a2d165385dc79f88b32726c40 not 
found: ID does not exist" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.092708 4830 scope.go:117] "RemoveContainer" containerID="4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd" Mar 11 09:19:06 crc kubenswrapper[4830]: E0311 09:19:06.093249 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd\": container with ID starting with 4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd not found: ID does not exist" containerID="4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.093318 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd"} err="failed to get container status \"4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd\": rpc error: code = NotFound desc = could not find container \"4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd\": container with ID starting with 4c043708fe5794c62fa86d4b448ca84302d2d56f210b8a70bb68dc02065563cd not found: ID does not exist" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.093370 4830 scope.go:117] "RemoveContainer" containerID="a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7" Mar 11 09:19:06 crc kubenswrapper[4830]: E0311 09:19:06.093671 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7\": container with ID starting with a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7 not found: ID does not exist" containerID="a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.093702 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7"} err="failed to get container status \"a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7\": rpc error: code = NotFound desc = could not find container \"a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7\": container with ID starting with a06963a7343d3a02cafe803f153b2158f895ca42f9860393a421f4cdd054b0f7 not found: ID does not exist" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.210932 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-k86x2"] Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.367330 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.411875 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.797457 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.845604 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:19:06 crc kubenswrapper[4830]: I0311 09:19:06.942181 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" path="/var/lib/kubelet/pods/ebefef77-9e3b-45d5-8301-53df1b75c9bb/volumes" Mar 11 09:19:08 crc kubenswrapper[4830]: I0311 09:19:08.753221 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbnm8"] Mar 11 09:19:08 crc kubenswrapper[4830]: I0311 09:19:08.753545 4830 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-operators-fbnm8" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="registry-server" containerID="cri-o://918f79df8188d8d923d5757298ec1f2159bc23269d63b702d43402890f4b0b33" gracePeriod=2 Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.022611 4830 generic.go:334] "Generic (PLEG): container finished" podID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerID="918f79df8188d8d923d5757298ec1f2159bc23269d63b702d43402890f4b0b33" exitCode=0 Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.023072 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbnm8" event={"ID":"0110c11f-22d8-4a16-b11b-08c46b6a3bed","Type":"ContainerDied","Data":"918f79df8188d8d923d5757298ec1f2159bc23269d63b702d43402890f4b0b33"} Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.374315 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.513509 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-utilities\") pod \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.513668 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-catalog-content\") pod \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.513807 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clw26\" (UniqueName: \"kubernetes.io/projected/0110c11f-22d8-4a16-b11b-08c46b6a3bed-kube-api-access-clw26\") pod 
\"0110c11f-22d8-4a16-b11b-08c46b6a3bed\" (UID: \"0110c11f-22d8-4a16-b11b-08c46b6a3bed\") " Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.522126 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-utilities" (OuterVolumeSpecName: "utilities") pod "0110c11f-22d8-4a16-b11b-08c46b6a3bed" (UID: "0110c11f-22d8-4a16-b11b-08c46b6a3bed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.523829 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0110c11f-22d8-4a16-b11b-08c46b6a3bed-kube-api-access-clw26" (OuterVolumeSpecName: "kube-api-access-clw26") pod "0110c11f-22d8-4a16-b11b-08c46b6a3bed" (UID: "0110c11f-22d8-4a16-b11b-08c46b6a3bed"). InnerVolumeSpecName "kube-api-access-clw26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.615510 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clw26\" (UniqueName: \"kubernetes.io/projected/0110c11f-22d8-4a16-b11b-08c46b6a3bed-kube-api-access-clw26\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.615562 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.686754 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0110c11f-22d8-4a16-b11b-08c46b6a3bed" (UID: "0110c11f-22d8-4a16-b11b-08c46b6a3bed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:19:09 crc kubenswrapper[4830]: I0311 09:19:09.717145 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0110c11f-22d8-4a16-b11b-08c46b6a3bed-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.041574 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbnm8" event={"ID":"0110c11f-22d8-4a16-b11b-08c46b6a3bed","Type":"ContainerDied","Data":"3ec669405684d95fe467a6b375592fd74e97daaf4b56f1448c25a983f513d9b4"} Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.041640 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbnm8" Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.042434 4830 scope.go:117] "RemoveContainer" containerID="918f79df8188d8d923d5757298ec1f2159bc23269d63b702d43402890f4b0b33" Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.057724 4830 scope.go:117] "RemoveContainer" containerID="b4ae693382e19fbb673451f687d6d47aa327485ebba1429a57b9dd21e10fa94c" Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.068394 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbnm8"] Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.072151 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbnm8"] Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.073916 4830 scope.go:117] "RemoveContainer" containerID="63db21c8ead7380860951efa76345f6bc5d2abfc3dd94047883a131e5ad3f456" Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.722389 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs"] Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.722799 4830 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" podUID="cf6e34a1-b355-4d79-ba6a-2a63673342ae" containerName="controller-manager" containerID="cri-o://35f5f9f2ab5a546eee2a87125843f65b1c849e34a83de76ddec580fa09941155" gracePeriod=30 Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.810926 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n"] Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.811469 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" podUID="b4d7ac84-63c4-4475-aaf0-6eacd3e788be" containerName="route-controller-manager" containerID="cri-o://78a7efac40dab5d41369a67fbb1c4f72f72c28642c6192e8b8fbedb3308e1d6a" gracePeriod=30 Mar 11 09:19:10 crc kubenswrapper[4830]: I0311 09:19:10.945295 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" path="/var/lib/kubelet/pods/0110c11f-22d8-4a16-b11b-08c46b6a3bed/volumes" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.048114 4830 generic.go:334] "Generic (PLEG): container finished" podID="cf6e34a1-b355-4d79-ba6a-2a63673342ae" containerID="35f5f9f2ab5a546eee2a87125843f65b1c849e34a83de76ddec580fa09941155" exitCode=0 Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.048205 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" event={"ID":"cf6e34a1-b355-4d79-ba6a-2a63673342ae","Type":"ContainerDied","Data":"35f5f9f2ab5a546eee2a87125843f65b1c849e34a83de76ddec580fa09941155"} Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.050123 4830 generic.go:334] "Generic (PLEG): container finished" podID="b4d7ac84-63c4-4475-aaf0-6eacd3e788be" 
containerID="78a7efac40dab5d41369a67fbb1c4f72f72c28642c6192e8b8fbedb3308e1d6a" exitCode=0 Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.050169 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" event={"ID":"b4d7ac84-63c4-4475-aaf0-6eacd3e788be","Type":"ContainerDied","Data":"78a7efac40dab5d41369a67fbb1c4f72f72c28642c6192e8b8fbedb3308e1d6a"} Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.311880 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.315873 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.440778 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-config\") pod \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.440871 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-serving-cert\") pod \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.440900 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-client-ca\") pod \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.440958 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-proxy-ca-bundles\") pod \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.440993 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpzpb\" (UniqueName: \"kubernetes.io/projected/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-kube-api-access-hpzpb\") pod \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.441044 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqd6k\" (UniqueName: \"kubernetes.io/projected/cf6e34a1-b355-4d79-ba6a-2a63673342ae-kube-api-access-wqd6k\") pod \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.441077 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-config\") pod \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.441122 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf6e34a1-b355-4d79-ba6a-2a63673342ae-serving-cert\") pod \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\" (UID: \"cf6e34a1-b355-4d79-ba6a-2a63673342ae\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.441176 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-client-ca\") pod 
\"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\" (UID: \"b4d7ac84-63c4-4475-aaf0-6eacd3e788be\") " Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.441971 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cf6e34a1-b355-4d79-ba6a-2a63673342ae" (UID: "cf6e34a1-b355-4d79-ba6a-2a63673342ae"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.441998 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-client-ca" (OuterVolumeSpecName: "client-ca") pod "cf6e34a1-b355-4d79-ba6a-2a63673342ae" (UID: "cf6e34a1-b355-4d79-ba6a-2a63673342ae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.442172 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-client-ca" (OuterVolumeSpecName: "client-ca") pod "b4d7ac84-63c4-4475-aaf0-6eacd3e788be" (UID: "b4d7ac84-63c4-4475-aaf0-6eacd3e788be"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.442417 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-config" (OuterVolumeSpecName: "config") pod "cf6e34a1-b355-4d79-ba6a-2a63673342ae" (UID: "cf6e34a1-b355-4d79-ba6a-2a63673342ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.442562 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-config" (OuterVolumeSpecName: "config") pod "b4d7ac84-63c4-4475-aaf0-6eacd3e788be" (UID: "b4d7ac84-63c4-4475-aaf0-6eacd3e788be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.447208 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-kube-api-access-hpzpb" (OuterVolumeSpecName: "kube-api-access-hpzpb") pod "b4d7ac84-63c4-4475-aaf0-6eacd3e788be" (UID: "b4d7ac84-63c4-4475-aaf0-6eacd3e788be"). InnerVolumeSpecName "kube-api-access-hpzpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.447408 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6e34a1-b355-4d79-ba6a-2a63673342ae-kube-api-access-wqd6k" (OuterVolumeSpecName: "kube-api-access-wqd6k") pod "cf6e34a1-b355-4d79-ba6a-2a63673342ae" (UID: "cf6e34a1-b355-4d79-ba6a-2a63673342ae"). InnerVolumeSpecName "kube-api-access-wqd6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.447776 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b4d7ac84-63c4-4475-aaf0-6eacd3e788be" (UID: "b4d7ac84-63c4-4475-aaf0-6eacd3e788be"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.447988 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6e34a1-b355-4d79-ba6a-2a63673342ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cf6e34a1-b355-4d79-ba6a-2a63673342ae" (UID: "cf6e34a1-b355-4d79-ba6a-2a63673342ae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543624 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqd6k\" (UniqueName: \"kubernetes.io/projected/cf6e34a1-b355-4d79-ba6a-2a63673342ae-kube-api-access-wqd6k\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543739 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543760 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf6e34a1-b355-4d79-ba6a-2a63673342ae-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543778 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543793 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543808 4830 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543823 4830 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-client-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543839 4830 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf6e34a1-b355-4d79-ba6a-2a63673342ae-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:11 crc kubenswrapper[4830]: I0311 09:19:11.543854 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpzpb\" (UniqueName: \"kubernetes.io/projected/b4d7ac84-63c4-4475-aaf0-6eacd3e788be-kube-api-access-hpzpb\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.056657 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" event={"ID":"b4d7ac84-63c4-4475-aaf0-6eacd3e788be","Type":"ContainerDied","Data":"b41ebdefdff0e02f3a090351c62a0e8f4a3a4d5171f9dae1c1204b4b164e004a"} Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.056927 4830 scope.go:117] "RemoveContainer" containerID="78a7efac40dab5d41369a67fbb1c4f72f72c28642c6192e8b8fbedb3308e1d6a" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.056695 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.059011 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" event={"ID":"cf6e34a1-b355-4d79-ba6a-2a63673342ae","Type":"ContainerDied","Data":"0792b10dafa7c0d2a294707b25ed247d5d7c8330b3c8056decb0b4c4ab2ead46"} Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.059095 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.084244 4830 scope.go:117] "RemoveContainer" containerID="35f5f9f2ab5a546eee2a87125843f65b1c849e34a83de76ddec580fa09941155" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.087492 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.090905 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58b6788979-g8q9n"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.110469 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.113074 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5cdb8c54d-wrjbs"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.744982 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59574c768d-w7m8t"] Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745341 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" 
containerName="extract-utilities" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745377 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerName="extract-utilities" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745397 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d7ac84-63c4-4475-aaf0-6eacd3e788be" containerName="route-controller-manager" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745410 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d7ac84-63c4-4475-aaf0-6eacd3e788be" containerName="route-controller-manager" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745429 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745444 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745466 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="extract-content" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745477 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="extract-content" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745497 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerName="extract-content" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745510 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerName="extract-content" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745528 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cf6e34a1-b355-4d79-ba6a-2a63673342ae" containerName="controller-manager" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745539 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6e34a1-b355-4d79-ba6a-2a63673342ae" containerName="controller-manager" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745558 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745570 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745590 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745602 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745623 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerName="extract-utilities" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745636 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerName="extract-utilities" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745652 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="extract-utilities" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745666 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="extract-utilities" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.745683 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerName="extract-content" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745696 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerName="extract-content" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745851 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6e34a1-b355-4d79-ba6a-2a63673342ae" containerName="controller-manager" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745869 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c6ba43-1cb8-442f-a5f2-b7bac6ed1c66" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745890 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d7ac84-63c4-4475-aaf0-6eacd3e788be" containerName="route-controller-manager" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745912 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebefef77-9e3b-45d5-8301-53df1b75c9bb" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.745931 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0110c11f-22d8-4a16-b11b-08c46b6a3bed" containerName="registry-server" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.746567 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.751532 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.752096 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.753001 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.753385 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.754562 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.754748 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.759310 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.760456 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.766602 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.766680 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.766764 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.766871 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.767119 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.767217 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.767815 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.770243 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.781635 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59574c768d-w7m8t"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.795708 4830 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.796299 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.797285 4830 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.797471 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e" gracePeriod=15 Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.797575 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2" gracePeriod=15 Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.797615 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57" gracePeriod=15 Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.797647 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f" gracePeriod=15 Mar 11 09:19:12 crc kubenswrapper[4830]: 
I0311 09:19:12.797678 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135" gracePeriod=15 Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800308 4830 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800461 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800473 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800480 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800487 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800514 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800519 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800529 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800535 4830 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800542 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800548 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800557 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800562 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800591 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800597 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800605 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800612 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800711 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800723 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800750 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800757 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800763 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800769 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800776 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800783 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.800930 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800939 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: 
E0311 09:19:12.800948 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.800953 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.801099 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.816315 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.861177 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.863092 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-client-ca\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.863155 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac658d02-afda-445f-9fe1-f276192398d4-client-ca\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 
09:19:12.863191 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac658d02-afda-445f-9fe1-f276192398d4-config\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.863212 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-config\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.863237 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbtg\" (UniqueName: \"kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.863257 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-proxy-ca-bundles\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.863335 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c958p\" (UniqueName: 
\"kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.863371 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac658d02-afda-445f-9fe1-f276192398d4-serving-cert\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.863396 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36eeae0e-1131-4f44-a1ab-0a9cd785335d-serving-cert\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.936246 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.941427 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d7ac84-63c4-4475-aaf0-6eacd3e788be" path="/var/lib/kubelet/pods/b4d7ac84-63c4-4475-aaf0-6eacd3e788be/volumes" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.941920 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf6e34a1-b355-4d79-ba6a-2a63673342ae" 
path="/var/lib/kubelet/pods/cf6e34a1-b355-4d79-ba6a-2a63673342ae/volumes" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.964912 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.964987 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c958p\" (UniqueName: \"kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965012 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965056 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965082 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac658d02-afda-445f-9fe1-f276192398d4-serving-cert\") pod 
\"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965100 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965116 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965131 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36eeae0e-1131-4f44-a1ab-0a9cd785335d-serving-cert\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965155 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965173 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965191 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-client-ca\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965213 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac658d02-afda-445f-9fe1-f276192398d4-client-ca\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965234 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965268 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac658d02-afda-445f-9fe1-f276192398d4-config\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965282 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-config\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965297 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbtg\" (UniqueName: \"kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.965313 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-proxy-ca-bundles\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.965639 4830 projected.go:194] Error preparing data for projected volume kube-api-access-c958p for pod openshift-controller-manager/controller-manager-59574c768d-w7m8t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.965686 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p podName:36eeae0e-1131-4f44-a1ab-0a9cd785335d nodeName:}" failed. 
No retries permitted until 2026-03-11 09:19:13.465670507 +0000 UTC m=+321.246821196 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c958p" (UniqueName: "kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p") pod "controller-manager-59574c768d-w7m8t" (UID: "36eeae0e-1131-4f44-a1ab-0a9cd785335d") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.965997 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-59574c768d-w7m8t.189bbee140de81ad openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-59574c768d-w7m8t,UID:36eeae0e-1131-4f44-a1ab-0a9cd785335d,APIVersion:v1,ResourceVersion:29889,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-c958p\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token\": dial tcp 38.102.83.169:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:19:12.965665197 +0000 UTC m=+320.746815886,LastTimestamp:2026-03-11 09:19:12.965665197 +0000 UTC m=+320.746815886,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.966683 4830 projected.go:194] Error 
preparing data for projected volume kube-api-access-spbtg for pod openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:12 crc kubenswrapper[4830]: E0311 09:19:12.966750 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg podName:ac658d02-afda-445f-9fe1-f276192398d4 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:13.466732177 +0000 UTC m=+321.247882866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-spbtg" (UniqueName: "kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg") pod "route-controller-manager-7fd449d484-7kbdc" (UID: "ac658d02-afda-445f-9fe1-f276192398d4") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.966972 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-proxy-ca-bundles\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.967290 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac658d02-afda-445f-9fe1-f276192398d4-client-ca\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " 
pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.967425 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac658d02-afda-445f-9fe1-f276192398d4-config\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.968303 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-client-ca\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.968715 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36eeae0e-1131-4f44-a1ab-0a9cd785335d-config\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.970497 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36eeae0e-1131-4f44-a1ab-0a9cd785335d-serving-cert\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:12 crc kubenswrapper[4830]: I0311 09:19:12.970504 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac658d02-afda-445f-9fe1-f276192398d4-serving-cert\") pod \"route-controller-manager-7fd449d484-7kbdc\" 
(UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.065932 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066005 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066042 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066070 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066074 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066105 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066122 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066140 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066175 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066173 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066202 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066237 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066259 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066274 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.066302 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:13 crc 
kubenswrapper[4830]: I0311 09:19:13.066323 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.067359 4830 generic.go:334] "Generic (PLEG): container finished" podID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" containerID="013768beeee63f95bb4a132abca269d76730d6d2c9c092020140d830294c1f19" exitCode=0 Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.067426 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32","Type":"ContainerDied","Data":"013768beeee63f95bb4a132abca269d76730d6d2c9c092020140d830294c1f19"} Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.068231 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.068745 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.069615 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 11 09:19:13 crc kubenswrapper[4830]: 
I0311 09:19:13.071702 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.072578 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2" exitCode=0 Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.072602 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57" exitCode=0 Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.072609 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f" exitCode=0 Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.072616 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135" exitCode=2 Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.072685 4830 scope.go:117] "RemoveContainer" containerID="193f8cc03c4999df46dd28a06cf82b6e04c3a30e8163ac89de8d3eacbe863fb4" Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.149945 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:19:13 crc kubenswrapper[4830]: W0311 09:19:13.171519 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-129222e8cd98e0e8b60e35e2696a7ea0842a0e474e1dbdefd9a2dc2611066ee4 WatchSource:0}: Error finding container 129222e8cd98e0e8b60e35e2696a7ea0842a0e474e1dbdefd9a2dc2611066ee4: Status 404 returned error can't find the container with id 129222e8cd98e0e8b60e35e2696a7ea0842a0e474e1dbdefd9a2dc2611066ee4 Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.470898 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbtg\" (UniqueName: \"kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:13 crc kubenswrapper[4830]: E0311 09:19:13.472307 4830 projected.go:194] Error preparing data for projected volume kube-api-access-spbtg for pod openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:13 crc kubenswrapper[4830]: E0311 09:19:13.472407 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg podName:ac658d02-afda-445f-9fe1-f276192398d4 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:14.472377328 +0000 UTC m=+322.253528057 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-spbtg" (UniqueName: "kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg") pod "route-controller-manager-7fd449d484-7kbdc" (UID: "ac658d02-afda-445f-9fe1-f276192398d4") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:13 crc kubenswrapper[4830]: I0311 09:19:13.473173 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c958p\" (UniqueName: \"kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:13 crc kubenswrapper[4830]: E0311 09:19:13.474056 4830 projected.go:194] Error preparing data for projected volume kube-api-access-c958p for pod openshift-controller-manager/controller-manager-59574c768d-w7m8t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:13 crc kubenswrapper[4830]: E0311 09:19:13.474170 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p podName:36eeae0e-1131-4f44-a1ab-0a9cd785335d nodeName:}" failed. No retries permitted until 2026-03-11 09:19:14.474140868 +0000 UTC m=+322.255291597 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c958p" (UniqueName: "kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p") pod "controller-manager-59574c768d-w7m8t" (UID: "36eeae0e-1131-4f44-a1ab-0a9cd785335d") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.080321 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10"} Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.081212 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.081319 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"129222e8cd98e0e8b60e35e2696a7ea0842a0e474e1dbdefd9a2dc2611066ee4"} Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.081731 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 
09:19:14.085171 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.380609 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.381325 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.381960 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.486864 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kube-api-access\") pod \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.487056 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kubelet-dir\") pod \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.487148 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-var-lock\") pod \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\" (UID: \"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32\") " Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.487227 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" (UID: "ca0a5270-19fb-4d44-8fa5-24c0fd6eed32"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.487348 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-var-lock" (OuterVolumeSpecName: "var-lock") pod "ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" (UID: "ca0a5270-19fb-4d44-8fa5-24c0fd6eed32"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.487604 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbtg\" (UniqueName: \"kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.487736 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c958p\" (UniqueName: \"kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.487837 4830 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.487865 4830 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-var-lock\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:14 crc kubenswrapper[4830]: E0311 09:19:14.489838 4830 projected.go:194] Error preparing data for projected volume kube-api-access-spbtg for pod openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:14 crc kubenswrapper[4830]: E0311 09:19:14.489962 4830 
projected.go:194] Error preparing data for projected volume kube-api-access-c958p for pod openshift-controller-manager/controller-manager-59574c768d-w7m8t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:14 crc kubenswrapper[4830]: E0311 09:19:14.489986 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg podName:ac658d02-afda-445f-9fe1-f276192398d4 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:16.489949249 +0000 UTC m=+324.271099978 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-spbtg" (UniqueName: "kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg") pod "route-controller-manager-7fd449d484-7kbdc" (UID: "ac658d02-afda-445f-9fe1-f276192398d4") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:14 crc kubenswrapper[4830]: E0311 09:19:14.490104 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p podName:36eeae0e-1131-4f44-a1ab-0a9cd785335d nodeName:}" failed. No retries permitted until 2026-03-11 09:19:16.490071754 +0000 UTC m=+324.271222483 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c958p" (UniqueName: "kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p") pod "controller-manager-59574c768d-w7m8t" (UID: "36eeae0e-1131-4f44-a1ab-0a9cd785335d") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.500165 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" (UID: "ca0a5270-19fb-4d44-8fa5-24c0fd6eed32"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:19:14 crc kubenswrapper[4830]: I0311 09:19:14.589847 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca0a5270-19fb-4d44-8fa5-24c0fd6eed32-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.096555 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.096685 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ca0a5270-19fb-4d44-8fa5-24c0fd6eed32","Type":"ContainerDied","Data":"bfa0e8b9d7845a51678aca97078a788d6701f0d25778c6ee25ed651fb7c496a3"} Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.096918 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa0e8b9d7845a51678aca97078a788d6701f0d25778c6ee25ed651fb7c496a3" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.101416 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.101789 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.222842 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.224436 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.225962 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.226702 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.227595 4830 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.412621 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.412682 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.412804 
4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.412835 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.412873 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.412882 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.413662 4830 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.413706 4830 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.413727 4830 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:15 crc kubenswrapper[4830]: E0311 09:19:15.891388 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: E0311 09:19:15.891984 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: E0311 09:19:15.892646 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: E0311 09:19:15.894119 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection 
refused" Mar 11 09:19:15 crc kubenswrapper[4830]: E0311 09:19:15.894442 4830 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:15 crc kubenswrapper[4830]: I0311 09:19:15.894501 4830 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 11 09:19:15 crc kubenswrapper[4830]: E0311 09:19:15.894801 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="200ms" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.095342 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="400ms" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.105235 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.106586 4830 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e" exitCode=0 Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.106660 4830 scope.go:117] "RemoveContainer" containerID="a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.106862 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.126617 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.127389 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.127861 4830 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.134986 4830 scope.go:117] "RemoveContainer" containerID="95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.155508 4830 scope.go:117] "RemoveContainer" containerID="b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.225197 4830 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-59574c768d-w7m8t.189bbee140de81ad 
openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-59574c768d-w7m8t,UID:36eeae0e-1131-4f44-a1ab-0a9cd785335d,APIVersion:v1,ResourceVersion:29889,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"kube-api-access-c958p\" : failed to fetch token: Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token\": dial tcp 38.102.83.169:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-11 09:19:12.965665197 +0000 UTC m=+320.746815886,LastTimestamp:2026-03-11 09:19:12.965665197 +0000 UTC m=+320.746815886,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.226509 4830 scope.go:117] "RemoveContainer" containerID="3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.244391 4830 scope.go:117] "RemoveContainer" containerID="747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.257082 4830 scope.go:117] "RemoveContainer" containerID="f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.285922 4830 scope.go:117] "RemoveContainer" containerID="a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.286417 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\": container with ID starting with 
a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2 not found: ID does not exist" containerID="a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.286453 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2"} err="failed to get container status \"a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\": rpc error: code = NotFound desc = could not find container \"a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2\": container with ID starting with a818e0a366405237bb5448619d929ca9b892f453b532375717f6372f34546ad2 not found: ID does not exist" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.286473 4830 scope.go:117] "RemoveContainer" containerID="95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.286739 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\": container with ID starting with 95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57 not found: ID does not exist" containerID="95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.286760 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57"} err="failed to get container status \"95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\": rpc error: code = NotFound desc = could not find container \"95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57\": container with ID starting with 95e89e078ab9b08f27d6c873855f1973259b8838fd8e21c8cce9e66ac4c08d57 not found: ID does not 
exist" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.286774 4830 scope.go:117] "RemoveContainer" containerID="b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.287225 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\": container with ID starting with b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f not found: ID does not exist" containerID="b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.287245 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f"} err="failed to get container status \"b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\": rpc error: code = NotFound desc = could not find container \"b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f\": container with ID starting with b78881843abe09daee07b30c10c3ab976b9e97c9b019218a4b8a9849a9fc262f not found: ID does not exist" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.287256 4830 scope.go:117] "RemoveContainer" containerID="3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.287562 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\": container with ID starting with 3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135 not found: ID does not exist" containerID="3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.287582 4830 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135"} err="failed to get container status \"3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\": rpc error: code = NotFound desc = could not find container \"3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135\": container with ID starting with 3532ab76b0a52a082b32f09c7720814a0bda694edf0fe1f33989a5a5b6de0135 not found: ID does not exist" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.287593 4830 scope.go:117] "RemoveContainer" containerID="747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.287786 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\": container with ID starting with 747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e not found: ID does not exist" containerID="747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.287800 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e"} err="failed to get container status \"747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\": rpc error: code = NotFound desc = could not find container \"747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e\": container with ID starting with 747ec15bd44ceb78110d7275f408968ac4409247cc67ae409266016a2c781b9e not found: ID does not exist" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.287811 4830 scope.go:117] "RemoveContainer" containerID="f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.287979 4830 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\": container with ID starting with f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f not found: ID does not exist" containerID="f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.287992 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f"} err="failed to get container status \"f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\": rpc error: code = NotFound desc = could not find container \"f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f\": container with ID starting with f8a85e7db9d30a889bececcdfa7dda36a9cc6125265b661c12b48103a358e76f not found: ID does not exist" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.495675 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="800ms" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.530565 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbtg\" (UniqueName: \"kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.530665 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c958p\" (UniqueName: 
\"kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.531001 4830 projected.go:194] Error preparing data for projected volume kube-api-access-spbtg for pod openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.531110 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg podName:ac658d02-afda-445f-9fe1-f276192398d4 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:20.531090733 +0000 UTC m=+328.312241422 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-spbtg" (UniqueName: "kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg") pod "route-controller-manager-7fd449d484-7kbdc" (UID: "ac658d02-afda-445f-9fe1-f276192398d4") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.531228 4830 projected.go:194] Error preparing data for projected volume kube-api-access-c958p for pod openshift-controller-manager/controller-manager-59574c768d-w7m8t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:16 crc kubenswrapper[4830]: E0311 09:19:16.531319 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p podName:36eeae0e-1131-4f44-a1ab-0a9cd785335d nodeName:}" failed. No retries permitted until 2026-03-11 09:19:20.531298959 +0000 UTC m=+328.312449688 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c958p" (UniqueName: "kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p") pod "controller-manager-59574c768d-w7m8t" (UID: "36eeae0e-1131-4f44-a1ab-0a9cd785335d") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:16 crc kubenswrapper[4830]: I0311 09:19:16.957106 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 11 09:19:17 crc kubenswrapper[4830]: E0311 09:19:17.297377 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="1.6s" Mar 11 09:19:18 crc kubenswrapper[4830]: E0311 09:19:18.898991 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="3.2s" Mar 11 09:19:20 crc kubenswrapper[4830]: I0311 09:19:20.594432 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c958p\" (UniqueName: \"kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:20 crc kubenswrapper[4830]: I0311 09:19:20.594917 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbtg\" (UniqueName: 
\"kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:20 crc kubenswrapper[4830]: E0311 09:19:20.596354 4830 projected.go:194] Error preparing data for projected volume kube-api-access-c958p for pod openshift-controller-manager/controller-manager-59574c768d-w7m8t: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:20 crc kubenswrapper[4830]: E0311 09:19:20.596549 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p podName:36eeae0e-1131-4f44-a1ab-0a9cd785335d nodeName:}" failed. No retries permitted until 2026-03-11 09:19:28.596484871 +0000 UTC m=+336.377635590 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c958p" (UniqueName: "kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p") pod "controller-manager-59574c768d-w7m8t" (UID: "36eeae0e-1131-4f44-a1ab-0a9cd785335d") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:20 crc kubenswrapper[4830]: E0311 09:19:20.608090 4830 projected.go:194] Error preparing data for projected volume kube-api-access-spbtg for pod openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc: failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:20 crc kubenswrapper[4830]: E0311 09:19:20.608216 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg podName:ac658d02-afda-445f-9fe1-f276192398d4 nodeName:}" failed. No retries permitted until 2026-03-11 09:19:28.608183033 +0000 UTC m=+336.389333762 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-spbtg" (UniqueName: "kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg") pod "route-controller-manager-7fd449d484-7kbdc" (UID: "ac658d02-afda-445f-9fe1-f276192398d4") : failed to fetch token: Post "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token": dial tcp 38.102.83.169:6443: connect: connection refused Mar 11 09:19:22 crc kubenswrapper[4830]: E0311 09:19:22.100419 4830 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="6.4s" Mar 11 09:19:22 crc kubenswrapper[4830]: I0311 09:19:22.934295 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:22 crc kubenswrapper[4830]: I0311 09:19:22.934703 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:23 crc kubenswrapper[4830]: I0311 09:19:23.932174 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:23 crc kubenswrapper[4830]: I0311 09:19:23.933648 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:23 crc kubenswrapper[4830]: I0311 09:19:23.934232 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:23 crc kubenswrapper[4830]: I0311 09:19:23.964208 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:23 crc kubenswrapper[4830]: I0311 09:19:23.964442 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:23 crc kubenswrapper[4830]: E0311 09:19:23.965208 4830 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:23 crc kubenswrapper[4830]: I0311 09:19:23.965912 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:24 crc kubenswrapper[4830]: I0311 09:19:24.166326 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"abb751afbe65c127ed0a01c44f4e4fcd1b7f2735a7a067ee5286cb52ce8e59ef"} Mar 11 09:19:25 crc kubenswrapper[4830]: I0311 09:19:25.178596 4830 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ed82e92bce828574582b86d94f5ed0cc702b827c0aed50590e7e374186e0063e" exitCode=0 Mar 11 09:19:25 crc kubenswrapper[4830]: I0311 09:19:25.178730 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ed82e92bce828574582b86d94f5ed0cc702b827c0aed50590e7e374186e0063e"} Mar 11 09:19:25 crc kubenswrapper[4830]: I0311 09:19:25.179248 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:25 crc kubenswrapper[4830]: I0311 09:19:25.180563 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:25 crc kubenswrapper[4830]: I0311 09:19:25.179678 4830 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:25 crc kubenswrapper[4830]: I0311 09:19:25.181271 4830 status_manager.go:851] "Failed to get status for pod" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Mar 11 09:19:25 crc kubenswrapper[4830]: E0311 09:19:25.181863 4830 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:26 crc kubenswrapper[4830]: I0311 09:19:26.188046 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a7ba4ba370adf2b36b7a354b3183186b2b90eb6be9d5cd4166110e00196b1262"} Mar 11 09:19:26 crc kubenswrapper[4830]: I0311 09:19:26.188421 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f3d20e4a165061134f565184d7132d80a3dba820550acc0f279d73ddacfc6d0c"} Mar 11 09:19:26 crc kubenswrapper[4830]: I0311 09:19:26.188442 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3ec4d8842879e982d8aaec5572083707eec75687b6bd9729b72edba0b091c032"} Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.195840 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.197574 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.197623 4830 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac" exitCode=1 Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.197700 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac"} Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.198200 4830 scope.go:117] "RemoveContainer" containerID="e94fb0fc96e88aded69b01771506f1c933bfbd84be19ec3d5bd3a696fef27fac" Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.201113 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3f0b58e4dea2db880a6e63ee5f6191595ed5a6367f4ea91e03da29a2ea795ce7"} Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.201195 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cff11cc40e8b20feb9508da5c2efe50e813235885b75044ecff51e8d683c2b4b"} Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.201286 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.201357 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:27 crc kubenswrapper[4830]: I0311 09:19:27.201376 4830 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.210417 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.212842 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.212903 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"451278c0f6740e9435af947e90f8eb6e459ae76bd53d2def5d13cb9a9aa2a78a"} Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.605777 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c958p\" (UniqueName: \"kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.632937 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c958p\" (UniqueName: \"kubernetes.io/projected/36eeae0e-1131-4f44-a1ab-0a9cd785335d-kube-api-access-c958p\") pod \"controller-manager-59574c768d-w7m8t\" (UID: \"36eeae0e-1131-4f44-a1ab-0a9cd785335d\") " pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.691063 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.707290 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbtg\" (UniqueName: \"kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.742852 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbtg\" (UniqueName: \"kubernetes.io/projected/ac658d02-afda-445f-9fe1-f276192398d4-kube-api-access-spbtg\") pod \"route-controller-manager-7fd449d484-7kbdc\" (UID: \"ac658d02-afda-445f-9fe1-f276192398d4\") " pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.966659 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.966882 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.967213 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.976777 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:28 crc kubenswrapper[4830]: I0311 09:19:28.979152 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:19:29 crc 
kubenswrapper[4830]: I0311 09:19:29.005372 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:29 crc kubenswrapper[4830]: W0311 09:19:29.033060 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36eeae0e_1131_4f44_a1ab_0a9cd785335d.slice/crio-23540b51856e23eb757b2df1d88d4be0cc4f178dc1c4bdd63ae63536a77bb01e WatchSource:0}: Error finding container 23540b51856e23eb757b2df1d88d4be0cc4f178dc1c4bdd63ae63536a77bb01e: Status 404 returned error can't find the container with id 23540b51856e23eb757b2df1d88d4be0cc4f178dc1c4bdd63ae63536a77bb01e Mar 11 09:19:29 crc kubenswrapper[4830]: I0311 09:19:29.219387 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" event={"ID":"36eeae0e-1131-4f44-a1ab-0a9cd785335d","Type":"ContainerStarted","Data":"784ac17c16ad708dbd1d69ba76c57fc9a34773a29ea42182d0cb246eba7466d1"} Mar 11 09:19:29 crc kubenswrapper[4830]: I0311 09:19:29.219812 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" event={"ID":"36eeae0e-1131-4f44-a1ab-0a9cd785335d","Type":"ContainerStarted","Data":"23540b51856e23eb757b2df1d88d4be0cc4f178dc1c4bdd63ae63536a77bb01e"} Mar 11 09:19:29 crc kubenswrapper[4830]: I0311 09:19:29.219847 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:19:29 crc kubenswrapper[4830]: I0311 09:19:29.219871 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:29 crc kubenswrapper[4830]: I0311 09:19:29.220622 4830 patch_prober.go:28] interesting pod/controller-manager-59574c768d-w7m8t container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 11 09:19:29 crc kubenswrapper[4830]: I0311 09:19:29.220673 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" podUID="36eeae0e-1131-4f44-a1ab-0a9cd785335d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.239915 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" event={"ID":"ac658d02-afda-445f-9fe1-f276192398d4","Type":"ContainerStarted","Data":"e361d63759000214441edbce98c9a5e5ea9b557904bb1cacdb475a66fb14b597"} Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.240348 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" event={"ID":"ac658d02-afda-445f-9fe1-f276192398d4","Type":"ContainerStarted","Data":"e4bc5d9ba1cfe01779cec5bc6800e1d69a0143ff5f818581e98505e70ac560cb"} Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.249680 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.943229 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.943320 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.943400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.943699 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.945938 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.946278 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.946606 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.955802 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.956940 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.962511 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.972358 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:19:30 crc kubenswrapper[4830]: I0311 09:19:30.973201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.173532 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.184458 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.193502 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.237459 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" podUID="ea88a306-e701-4a49-b4d2-7c4b62372c06" containerName="oauth-openshift" containerID="cri-o://4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b" gracePeriod=15 Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.249808 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.729244 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758481 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-login\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758526 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-session\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758558 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-cliconfig\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758627 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-idp-0-file-data\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758662 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-trusted-ca-bundle\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: 
\"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758688 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-policies\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758715 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-provider-selection\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758740 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg4bn\" (UniqueName: \"kubernetes.io/projected/ea88a306-e701-4a49-b4d2-7c4b62372c06-kube-api-access-bg4bn\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758763 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-serving-cert\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758792 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-error\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 
09:19:31.758817 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-dir\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758863 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-service-ca\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758887 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-ocp-branding-template\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.758932 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-router-certs\") pod \"ea88a306-e701-4a49-b4d2-7c4b62372c06\" (UID: \"ea88a306-e701-4a49-b4d2-7c4b62372c06\") " Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.762431 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.763259 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.764490 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.765149 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.765231 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.765282 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.765798 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.766001 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.767307 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.767666 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.768931 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea88a306-e701-4a49-b4d2-7c4b62372c06-kube-api-access-bg4bn" (OuterVolumeSpecName: "kube-api-access-bg4bn") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "kube-api-access-bg4bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.769167 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.770163 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: W0311 09:19:31.772252 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-b3c964b0d45509f8a9568650fac6fecb13ea6b8731c7c0420eb8463543061ea0 WatchSource:0}: Error finding container b3c964b0d45509f8a9568650fac6fecb13ea6b8731c7c0420eb8463543061ea0: Status 404 returned error can't find the container with id b3c964b0d45509f8a9568650fac6fecb13ea6b8731c7c0420eb8463543061ea0 Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.772454 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ea88a306-e701-4a49-b4d2-7c4b62372c06" (UID: "ea88a306-e701-4a49-b4d2-7c4b62372c06"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860792 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860827 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860839 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860848 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860858 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860868 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860876 4830 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860886 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860894 4830 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860904 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860914 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg4bn\" (UniqueName: \"kubernetes.io/projected/ea88a306-e701-4a49-b4d2-7c4b62372c06-kube-api-access-bg4bn\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860922 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc kubenswrapper[4830]: I0311 09:19:31.860931 4830 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ea88a306-e701-4a49-b4d2-7c4b62372c06-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:31 crc 
kubenswrapper[4830]: I0311 09:19:31.860939 4830 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea88a306-e701-4a49-b4d2-7c4b62372c06-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.227979 4830 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.249431 4830 patch_prober.go:28] interesting pod/route-controller-manager-7fd449d484-7kbdc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.249496 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" podUID="ac658d02-afda-445f-9fe1-f276192398d4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.255143 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a1018da27388a4667a0c545cbfbe583eac770bbee15a5a102ad61157d424805c"} Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.255180 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"86902f3cb64e01aec7fc9623ceba42092762a72806a61599a47dd338ab044c72"} Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.256578 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8db705456c8aacf07ed791a4492e58d519d8231ecdf915616d7f9b75dcd07046"} Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.256615 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"558e103543cdb22cf3337c0a89117c9e6555862b711d2f6f155355a53401c9b1"} Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.258496 4830 generic.go:334] "Generic (PLEG): container finished" podID="ea88a306-e701-4a49-b4d2-7c4b62372c06" containerID="4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b" exitCode=0 Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.258550 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" event={"ID":"ea88a306-e701-4a49-b4d2-7c4b62372c06","Type":"ContainerDied","Data":"4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b"} Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.258558 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.258578 4830 scope.go:117] "RemoveContainer" containerID="4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.258566 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-k86x2" event={"ID":"ea88a306-e701-4a49-b4d2-7c4b62372c06","Type":"ContainerDied","Data":"c8de23314e89a229d5ca30f7c047ea2763189c7634e3c5d9ceccb8229ad6d0db"} Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.260082 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8fb8d8f969e1146c8d92728078b56625b031e46ebd774c2d2c22364bd50d64c2"} Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.260123 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b3c964b0d45509f8a9568650fac6fecb13ea6b8731c7c0420eb8463543061ea0"} Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.260302 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.260317 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.264921 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.278685 4830 scope.go:117] "RemoveContainer" 
containerID="4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b" Mar 11 09:19:32 crc kubenswrapper[4830]: E0311 09:19:32.279364 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b\": container with ID starting with 4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b not found: ID does not exist" containerID="4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.279419 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b"} err="failed to get container status \"4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b\": rpc error: code = NotFound desc = could not find container \"4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b\": container with ID starting with 4ec27a94117c942abd3974993fbfc8559765881781df5a31ae0e076d1a76684b not found: ID does not exist" Mar 11 09:19:32 crc kubenswrapper[4830]: I0311 09:19:32.948057 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7e1da589-bfef-402d-b405-9d77cec2d5f0" Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.260652 4830 patch_prober.go:28] interesting pod/route-controller-manager-7fd449d484-7kbdc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.260760 4830 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" podUID="ac658d02-afda-445f-9fe1-f276192398d4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.271844 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.271933 4830 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="8db705456c8aacf07ed791a4492e58d519d8231ecdf915616d7f9b75dcd07046" exitCode=255 Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.272102 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"8db705456c8aacf07ed791a4492e58d519d8231ecdf915616d7f9b75dcd07046"} Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.273275 4830 scope.go:117] "RemoveContainer" containerID="8db705456c8aacf07ed791a4492e58d519d8231ecdf915616d7f9b75dcd07046" Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.274936 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.275390 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:33 crc kubenswrapper[4830]: I0311 09:19:33.293996 4830 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7e1da589-bfef-402d-b405-9d77cec2d5f0" Mar 11 09:19:34 crc kubenswrapper[4830]: I0311 09:19:34.288320 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 11 09:19:34 crc kubenswrapper[4830]: I0311 09:19:34.289232 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 11 09:19:34 crc kubenswrapper[4830]: I0311 09:19:34.289312 4830 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="d11a5504ee2ad256167d25801d4445f10b0d2e6d443ec0f06ce39deb09139bcc" exitCode=255 Mar 11 09:19:34 crc kubenswrapper[4830]: I0311 09:19:34.289360 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"d11a5504ee2ad256167d25801d4445f10b0d2e6d443ec0f06ce39deb09139bcc"} Mar 11 09:19:34 crc kubenswrapper[4830]: I0311 09:19:34.289409 4830 scope.go:117] "RemoveContainer" containerID="8db705456c8aacf07ed791a4492e58d519d8231ecdf915616d7f9b75dcd07046" Mar 11 09:19:34 crc kubenswrapper[4830]: I0311 09:19:34.290461 4830 scope.go:117] "RemoveContainer" containerID="d11a5504ee2ad256167d25801d4445f10b0d2e6d443ec0f06ce39deb09139bcc" Mar 11 09:19:34 crc kubenswrapper[4830]: E0311 09:19:34.291063 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:19:35 crc kubenswrapper[4830]: I0311 09:19:35.301465 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 11 09:19:40 crc kubenswrapper[4830]: I0311 09:19:40.006357 4830 patch_prober.go:28] interesting pod/route-controller-manager-7fd449d484-7kbdc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:19:40 crc kubenswrapper[4830]: I0311 09:19:40.006891 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" podUID="ac658d02-afda-445f-9fe1-f276192398d4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:19:41 crc kubenswrapper[4830]: I0311 09:19:41.184912 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:19:42 crc kubenswrapper[4830]: I0311 09:19:42.111248 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 11 09:19:42 crc kubenswrapper[4830]: I0311 09:19:42.529843 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 11 09:19:42 crc kubenswrapper[4830]: I0311 09:19:42.837943 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 11 09:19:42 crc kubenswrapper[4830]: I0311 09:19:42.969856 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 11 09:19:43 crc kubenswrapper[4830]: I0311 09:19:43.013058 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 11 09:19:43 crc kubenswrapper[4830]: I0311 09:19:43.359073 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 11 09:19:43 crc kubenswrapper[4830]: I0311 09:19:43.736308 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 11 09:19:43 crc kubenswrapper[4830]: I0311 09:19:43.860857 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 11 09:19:44 crc kubenswrapper[4830]: I0311 09:19:44.340331 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 11 09:19:44 crc kubenswrapper[4830]: I0311 09:19:44.392848 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 11 09:19:44 crc kubenswrapper[4830]: I0311 09:19:44.432045 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 11 09:19:44 crc kubenswrapper[4830]: I0311 09:19:44.516936 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 11 09:19:44 crc kubenswrapper[4830]: I0311 09:19:44.690091 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 11 09:19:44 crc kubenswrapper[4830]: I0311 
09:19:44.756359 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 11 09:19:44 crc kubenswrapper[4830]: I0311 09:19:44.932810 4830 scope.go:117] "RemoveContainer" containerID="d11a5504ee2ad256167d25801d4445f10b0d2e6d443ec0f06ce39deb09139bcc" Mar 11 09:19:45 crc kubenswrapper[4830]: I0311 09:19:45.229912 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 11 09:19:45 crc kubenswrapper[4830]: I0311 09:19:45.251080 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 11 09:19:45 crc kubenswrapper[4830]: I0311 09:19:45.403351 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 11 09:19:45 crc kubenswrapper[4830]: I0311 09:19:45.403438 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e9b7cd31a068da2ca96e2bf3703a37b3341161faf4a36cf206dedd093ca7f990"} Mar 11 09:19:45 crc kubenswrapper[4830]: I0311 09:19:45.537979 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.065688 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.134988 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.270745 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.413500 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.414369 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.414406 4830 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="e9b7cd31a068da2ca96e2bf3703a37b3341161faf4a36cf206dedd093ca7f990" exitCode=255 Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.414431 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"e9b7cd31a068da2ca96e2bf3703a37b3341161faf4a36cf206dedd093ca7f990"} Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.414462 4830 scope.go:117] "RemoveContainer" containerID="d11a5504ee2ad256167d25801d4445f10b0d2e6d443ec0f06ce39deb09139bcc" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.415117 4830 scope.go:117] "RemoveContainer" containerID="e9b7cd31a068da2ca96e2bf3703a37b3341161faf4a36cf206dedd093ca7f990" Mar 11 09:19:46 crc kubenswrapper[4830]: E0311 09:19:46.415550 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.483960 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.491428 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.530833 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.743327 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 11 09:19:46 crc kubenswrapper[4830]: I0311 09:19:46.969836 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.076243 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.101869 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.204972 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.223993 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.409717 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.410054 4830 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.424132 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.424718 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.429932 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.645189 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.816196 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.836301 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.932627 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 11 09:19:47 crc kubenswrapper[4830]: I0311 09:19:47.943108 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.035513 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.079564 4830 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.095333 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.098149 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.195624 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.203237 4830 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.210173 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" podStartSLOduration=38.210147002 podStartE2EDuration="38.210147002s" podCreationTimestamp="2026-03-11 09:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:19:32.334356485 +0000 UTC m=+340.115507224" watchObservedRunningTime="2026-03-11 09:19:48.210147002 +0000 UTC m=+355.991297751" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.211434 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.211421106 podStartE2EDuration="36.211421106s" podCreationTimestamp="2026-03-11 09:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:19:32.274702473 +0000 UTC m=+340.055853162" watchObservedRunningTime="2026-03-11 
09:19:48.211421106 +0000 UTC m=+355.992571825" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.214186 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59574c768d-w7m8t" podStartSLOduration=38.21417004 podStartE2EDuration="38.21417004s" podCreationTimestamp="2026-03-11 09:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:19:32.39926766 +0000 UTC m=+340.180418369" watchObservedRunningTime="2026-03-11 09:19:48.21417004 +0000 UTC m=+355.995320759" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.215559 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-k86x2"] Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.215826 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.216067 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc","openshift-controller-manager/controller-manager-59574c768d-w7m8t"] Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.216571 4830 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.216622 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a7bc2420-5f7a-4113-84ab-58279da87f62" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.224205 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.255729 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.255698988 podStartE2EDuration="16.255698988s" podCreationTimestamp="2026-03-11 09:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:19:48.246317178 +0000 UTC m=+356.027467947" watchObservedRunningTime="2026-03-11 09:19:48.255698988 +0000 UTC m=+356.036849727" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.269174 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.340178 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.419757 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.533672 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.689580 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.777172 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.818774 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.828793 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 11 09:19:48 crc kubenswrapper[4830]: I0311 09:19:48.957220 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea88a306-e701-4a49-b4d2-7c4b62372c06" path="/var/lib/kubelet/pods/ea88a306-e701-4a49-b4d2-7c4b62372c06/volumes" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.020663 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.028278 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.076080 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.092717 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.167306 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.217540 4830 patch_prober.go:28] interesting pod/route-controller-manager-7fd449d484-7kbdc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.217860 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" podUID="ac658d02-afda-445f-9fe1-f276192398d4" containerName="route-controller-manager" 
probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.254529 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.261767 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.319706 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.342805 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.359134 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.404920 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.435406 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.450461 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.512882 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.543901 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"openshift-service-ca.crt" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.626615 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.671857 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.744010 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.803991 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.806053 4830 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.820496 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.847112 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.863347 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 11 09:19:49 crc kubenswrapper[4830]: I0311 09:19:49.919684 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.011323 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.070657 4830 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.092392 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.178939 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.218959 4830 patch_prober.go:28] interesting pod/route-controller-manager-7fd449d484-7kbdc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.219076 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc" podUID="ac658d02-afda-445f-9fe1-f276192398d4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.331778 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.340639 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.376110 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.408228 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.506998 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.534263 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.572156 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.668161 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.688891 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.690010 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.697551 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.717105 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.742089 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.744928 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.820145 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.835279 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.836104 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.856957 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.898556 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.902741 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 11 09:19:50 crc kubenswrapper[4830]: I0311 09:19:50.958415 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.081267 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.163293 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.204433 4830 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.212457 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.218974 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.339594 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.501989 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.514784 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.607576 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.634597 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.790547 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.843294 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.912891 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.920117 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 11 09:19:51 crc kubenswrapper[4830]: I0311 09:19:51.987228 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.001564 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.016825 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.026873 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.052530 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.093204 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.281905 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.301792 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.480745 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.481284 4830 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.544808 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.586035 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.594602 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.631103 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.656860 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.711385 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.975116 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 11 09:19:52 crc kubenswrapper[4830]: I0311 09:19:52.989651 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.069657 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.157436 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.165227 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.172335 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.305828 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.306975 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.434287 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.485375 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.590132 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.598237 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.619335 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.658769 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.670149 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.717212 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.751643 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.816838 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.855409 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.945001 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 11 09:19:53 crc kubenswrapper[4830]: I0311 09:19:53.969871 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.026288 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.143681 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.285105 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.330676 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.420846 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.479963 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.501659 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.558291 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.559710 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.631180 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.644458 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.652126 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.722896 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.723060 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.766738 4830 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.767067 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10" gracePeriod=5
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.808348 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.888478 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.938130 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 11 09:19:54 crc kubenswrapper[4830]: I0311 09:19:54.949694 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.086928 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.228077 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.232307 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.247242 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.251834 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.316506 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.369409 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.404757 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.456281 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.531725 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.534054 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.623137 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.658005 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.673204 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.677417 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.679003 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.785564 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 11 09:19:55 crc kubenswrapper[4830]: I0311 09:19:55.827178 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.124118 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.248690 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.291472 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.359748 4830 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.382695 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.383434 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.415753 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.517937 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.524376 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.538354 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.575364 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.623210 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.647550 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.777749 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 11 09:19:56 crc kubenswrapper[4830]: I0311 09:19:56.988091 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 11 09:19:57 crc kubenswrapper[4830]: I0311 09:19:57.199979 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 11 09:19:57 crc kubenswrapper[4830]: I0311 09:19:57.301739 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 11 09:19:57 crc kubenswrapper[4830]: I0311 09:19:57.515487 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 11 09:19:57 crc kubenswrapper[4830]: I0311 09:19:57.517276 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 11 09:19:57 crc kubenswrapper[4830]: I0311 09:19:57.611414 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 11 09:19:57 crc kubenswrapper[4830]: I0311 09:19:57.763975 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 11 09:19:58 crc kubenswrapper[4830]: I0311 09:19:58.097300 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 11 09:19:58 crc kubenswrapper[4830]: I0311 09:19:58.396389 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 11 09:19:58 crc kubenswrapper[4830]: I0311 09:19:58.406323 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 11 09:19:58 crc kubenswrapper[4830]: I0311 09:19:58.624775 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 11 09:19:58 crc kubenswrapper[4830]: I0311 09:19:58.641450 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 11 09:19:58 crc kubenswrapper[4830]: I0311 09:19:58.862485 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 11 09:19:58 crc kubenswrapper[4830]: I0311 09:19:58.905114 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.029978 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fd449d484-7kbdc"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.144987 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.184355 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.215656 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.262800 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.297801 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.477711 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.724895 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.862472 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.897193 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.932652 4830 scope.go:117] "RemoveContainer" containerID="e9b7cd31a068da2ca96e2bf3703a37b3341161faf4a36cf206dedd093ca7f990"
Mar 11 09:19:59 crc kubenswrapper[4830]: E0311 09:19:59.933099 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 11 09:19:59 crc kubenswrapper[4830]: I0311 09:19:59.973279 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.101082 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.171821 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553680-knd42"]
Mar 11 09:20:00 crc kubenswrapper[4830]: E0311 09:20:00.172306 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea88a306-e701-4a49-b4d2-7c4b62372c06" containerName="oauth-openshift"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.172505 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea88a306-e701-4a49-b4d2-7c4b62372c06" containerName="oauth-openshift"
Mar 11 09:20:00 crc kubenswrapper[4830]: E0311 09:20:00.172616 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.172750 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 09:20:00 crc kubenswrapper[4830]: E0311 09:20:00.172840 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" containerName="installer"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.172923 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" containerName="installer"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.173163 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.173285 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea88a306-e701-4a49-b4d2-7c4b62372c06" containerName="oauth-openshift"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.173378 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca0a5270-19fb-4d44-8fa5-24c0fd6eed32" containerName="installer"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.173844 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553680-knd42"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.176561 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.176940 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.177430 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.184040 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553680-knd42"]
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.225647 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xb56\" (UniqueName: \"kubernetes.io/projected/5bd5fce1-cc80-4c14-92fc-9c220e5ffa93-kube-api-access-5xb56\") pod \"auto-csr-approver-29553680-knd42\" (UID: \"5bd5fce1-cc80-4c14-92fc-9c220e5ffa93\") " pod="openshift-infra/auto-csr-approver-29553680-knd42"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.327136 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xb56\" (UniqueName: \"kubernetes.io/projected/5bd5fce1-cc80-4c14-92fc-9c220e5ffa93-kube-api-access-5xb56\") pod \"auto-csr-approver-29553680-knd42\" (UID: \"5bd5fce1-cc80-4c14-92fc-9c220e5ffa93\") " pod="openshift-infra/auto-csr-approver-29553680-knd42"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.347822 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xb56\" (UniqueName: \"kubernetes.io/projected/5bd5fce1-cc80-4c14-92fc-9c220e5ffa93-kube-api-access-5xb56\") pod \"auto-csr-approver-29553680-knd42\" (UID: \"5bd5fce1-cc80-4c14-92fc-9c220e5ffa93\") " pod="openshift-infra/auto-csr-approver-29553680-knd42"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.350764 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.350862 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.398292 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.427958 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428119 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428222 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428273 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428268 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428326 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428383 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428430 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428526 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428859 4830 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428917 4830 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428943 4830 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.428969 4830 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.434242 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.500458 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553680-knd42" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.529990 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.530335 4830 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.569086 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.569390 4830 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10" exitCode=137 Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.569458 4830 scope.go:117] "RemoveContainer" containerID="39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.569476 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.590950 4830 scope.go:117] "RemoveContainer" containerID="39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10" Mar 11 09:20:00 crc kubenswrapper[4830]: E0311 09:20:00.591502 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10\": container with ID starting with 39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10 not found: ID does not exist" containerID="39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.591528 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10"} err="failed to get container status \"39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10\": rpc error: code = NotFound desc = could not find container \"39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10\": container with ID starting with 39da290ebaa5e153c5db5ea69882620e7f2a7278bba3e98bcf438f749a594a10 not found: ID does not exist" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.615384 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.714651 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553680-knd42"] Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.775244 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-c5d49d659-7zhsj"] Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.776821 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.782657 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.782914 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.783335 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.784357 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.784512 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.785306 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.785434 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.785587 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.785772 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.786081 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 11 09:20:00 
crc kubenswrapper[4830]: I0311 09:20:00.786278 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.786486 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.791360 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.793128 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c5d49d659-7zhsj"] Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.800709 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.804152 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837587 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837646 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: 
\"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837689 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837719 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837744 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl6md\" (UniqueName: \"kubernetes.io/projected/5c587d05-3597-4434-9c51-0c8ce8fd92ef-kube-api-access-sl6md\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837775 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-error\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837857 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837924 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.837982 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-audit-policies\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.838053 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-login\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.838083 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/5c587d05-3597-4434-9c51-0c8ce8fd92ef-audit-dir\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.838141 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.838164 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-session\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.838218 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.885396 4830 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.939079 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.939202 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl6md\" (UniqueName: \"kubernetes.io/projected/5c587d05-3597-4434-9c51-0c8ce8fd92ef-kube-api-access-sl6md\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.939745 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-error\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.939847 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.939956 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " 
pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.940008 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-audit-policies\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.940109 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-login\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.941646 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.941741 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c587d05-3597-4434-9c51-0c8ce8fd92ef-audit-dir\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.941782 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.941818 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-session\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.941848 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.941893 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.941950 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " 
pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.942000 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.942340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-audit-policies\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.943738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.943847 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.943923 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5c587d05-3597-4434-9c51-0c8ce8fd92ef-audit-dir\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc 
kubenswrapper[4830]: I0311 09:20:00.944059 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.944100 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.944243 4830 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.944945 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.945233 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.948128 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.949032 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.953819 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-error\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.956073 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-system-session\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.957964 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5c587d05-3597-4434-9c51-0c8ce8fd92ef-v4-0-config-user-template-login\") pod \"oauth-openshift-c5d49d659-7zhsj\" 
(UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.960201 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl6md\" (UniqueName: \"kubernetes.io/projected/5c587d05-3597-4434-9c51-0c8ce8fd92ef-kube-api-access-sl6md\") pod \"oauth-openshift-c5d49d659-7zhsj\" (UID: \"5c587d05-3597-4434-9c51-0c8ce8fd92ef\") " pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.960297 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.960313 4830 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d9150bbf-3761-4a01-926a-d2e010aa1cce" Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.966542 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 11 09:20:00 crc kubenswrapper[4830]: I0311 09:20:00.966585 4830 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="d9150bbf-3761-4a01-926a-d2e010aa1cce" Mar 11 09:20:01 crc kubenswrapper[4830]: I0311 09:20:01.118650 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:01 crc kubenswrapper[4830]: I0311 09:20:01.145553 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 11 09:20:01 crc kubenswrapper[4830]: I0311 09:20:01.192726 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 11 09:20:01 crc kubenswrapper[4830]: I0311 09:20:01.576822 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553680-knd42" event={"ID":"5bd5fce1-cc80-4c14-92fc-9c220e5ffa93","Type":"ContainerStarted","Data":"f314e8d43b672644ba62331d85ac9f1e668aebb08e676fc024713dd22bcfebc9"} Mar 11 09:20:01 crc kubenswrapper[4830]: I0311 09:20:01.629828 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c5d49d659-7zhsj"] Mar 11 09:20:01 crc kubenswrapper[4830]: W0311 09:20:01.637906 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c587d05_3597_4434_9c51_0c8ce8fd92ef.slice/crio-c124f69036e3f9c15c66647a5e0b79375f42c411f163b77c41e3bea7b7aa371f WatchSource:0}: Error finding container c124f69036e3f9c15c66647a5e0b79375f42c411f163b77c41e3bea7b7aa371f: Status 404 returned error can't find the container with id c124f69036e3f9c15c66647a5e0b79375f42c411f163b77c41e3bea7b7aa371f Mar 11 09:20:01 crc kubenswrapper[4830]: I0311 09:20:01.791205 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 11 09:20:01 crc kubenswrapper[4830]: I0311 09:20:01.856109 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 11 09:20:02 crc kubenswrapper[4830]: I0311 09:20:02.582698 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="5bd5fce1-cc80-4c14-92fc-9c220e5ffa93" containerID="9d0b0b705c635201e241b6ade6a3bd1f15fd636e2dd493fa96ff88d6f43eb60d" exitCode=0 Mar 11 09:20:02 crc kubenswrapper[4830]: I0311 09:20:02.582798 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553680-knd42" event={"ID":"5bd5fce1-cc80-4c14-92fc-9c220e5ffa93","Type":"ContainerDied","Data":"9d0b0b705c635201e241b6ade6a3bd1f15fd636e2dd493fa96ff88d6f43eb60d"} Mar 11 09:20:02 crc kubenswrapper[4830]: I0311 09:20:02.584141 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" event={"ID":"5c587d05-3597-4434-9c51-0c8ce8fd92ef","Type":"ContainerStarted","Data":"9b6d6da267d72a1cd8394f16eb500a09eb19e787c45911130bd83e93d42d3d29"} Mar 11 09:20:02 crc kubenswrapper[4830]: I0311 09:20:02.584192 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" event={"ID":"5c587d05-3597-4434-9c51-0c8ce8fd92ef","Type":"ContainerStarted","Data":"c124f69036e3f9c15c66647a5e0b79375f42c411f163b77c41e3bea7b7aa371f"} Mar 11 09:20:02 crc kubenswrapper[4830]: I0311 09:20:02.584383 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:02 crc kubenswrapper[4830]: I0311 09:20:02.640866 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" podStartSLOduration=56.640849463 podStartE2EDuration="56.640849463s" podCreationTimestamp="2026-03-11 09:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:20:02.637925545 +0000 UTC m=+370.419076264" watchObservedRunningTime="2026-03-11 09:20:02.640849463 +0000 UTC m=+370.422000162" Mar 11 09:20:02 crc kubenswrapper[4830]: I0311 09:20:02.677551 4830 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-c5d49d659-7zhsj" Mar 11 09:20:03 crc kubenswrapper[4830]: I0311 09:20:03.948884 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553680-knd42" Mar 11 09:20:04 crc kubenswrapper[4830]: I0311 09:20:04.007497 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xb56\" (UniqueName: \"kubernetes.io/projected/5bd5fce1-cc80-4c14-92fc-9c220e5ffa93-kube-api-access-5xb56\") pod \"5bd5fce1-cc80-4c14-92fc-9c220e5ffa93\" (UID: \"5bd5fce1-cc80-4c14-92fc-9c220e5ffa93\") " Mar 11 09:20:04 crc kubenswrapper[4830]: I0311 09:20:04.016562 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd5fce1-cc80-4c14-92fc-9c220e5ffa93-kube-api-access-5xb56" (OuterVolumeSpecName: "kube-api-access-5xb56") pod "5bd5fce1-cc80-4c14-92fc-9c220e5ffa93" (UID: "5bd5fce1-cc80-4c14-92fc-9c220e5ffa93"). InnerVolumeSpecName "kube-api-access-5xb56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:20:04 crc kubenswrapper[4830]: I0311 09:20:04.109441 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xb56\" (UniqueName: \"kubernetes.io/projected/5bd5fce1-cc80-4c14-92fc-9c220e5ffa93-kube-api-access-5xb56\") on node \"crc\" DevicePath \"\"" Mar 11 09:20:04 crc kubenswrapper[4830]: I0311 09:20:04.609894 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553680-knd42" event={"ID":"5bd5fce1-cc80-4c14-92fc-9c220e5ffa93","Type":"ContainerDied","Data":"f314e8d43b672644ba62331d85ac9f1e668aebb08e676fc024713dd22bcfebc9"} Mar 11 09:20:04 crc kubenswrapper[4830]: I0311 09:20:04.609941 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f314e8d43b672644ba62331d85ac9f1e668aebb08e676fc024713dd22bcfebc9" Mar 11 09:20:04 crc kubenswrapper[4830]: I0311 09:20:04.609984 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553680-knd42" Mar 11 09:20:13 crc kubenswrapper[4830]: I0311 09:20:13.933071 4830 scope.go:117] "RemoveContainer" containerID="e9b7cd31a068da2ca96e2bf3703a37b3341161faf4a36cf206dedd093ca7f990" Mar 11 09:20:14 crc kubenswrapper[4830]: I0311 09:20:14.661738 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 11 09:20:14 crc kubenswrapper[4830]: I0311 09:20:14.662394 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"983f469b10785f203f717ed2fd68db96b90862247df90aa504e8ae11f608772a"} Mar 11 09:20:22 crc kubenswrapper[4830]: I0311 09:20:22.719496 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="c61d535e-afb5-4006-a758-8bba8735a860" containerID="e908d7bcbb7cc0c0a321bddd4f7137df0885dfd2db6cf98d4f3fc7e53b9125da" exitCode=0 Mar 11 09:20:22 crc kubenswrapper[4830]: I0311 09:20:22.719630 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" event={"ID":"c61d535e-afb5-4006-a758-8bba8735a860","Type":"ContainerDied","Data":"e908d7bcbb7cc0c0a321bddd4f7137df0885dfd2db6cf98d4f3fc7e53b9125da"} Mar 11 09:20:22 crc kubenswrapper[4830]: I0311 09:20:22.720962 4830 scope.go:117] "RemoveContainer" containerID="e908d7bcbb7cc0c0a321bddd4f7137df0885dfd2db6cf98d4f3fc7e53b9125da" Mar 11 09:20:23 crc kubenswrapper[4830]: I0311 09:20:23.729863 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" event={"ID":"c61d535e-afb5-4006-a758-8bba8735a860","Type":"ContainerStarted","Data":"742a30f6c6d64fe69e0bd3b024a5622c43a61519086e37cef7c11ca146eed70f"} Mar 11 09:20:23 crc kubenswrapper[4830]: I0311 09:20:23.730652 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:20:23 crc kubenswrapper[4830]: I0311 09:20:23.733413 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.060276 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bc8c8"] Mar 11 09:20:46 crc kubenswrapper[4830]: E0311 09:20:46.060918 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd5fce1-cc80-4c14-92fc-9c220e5ffa93" containerName="oc" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.060931 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd5fce1-cc80-4c14-92fc-9c220e5ffa93" containerName="oc" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.061029 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd5fce1-cc80-4c14-92fc-9c220e5ffa93" containerName="oc" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.061398 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.075398 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bc8c8"] Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.203054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7eb2fc63-2d6e-4aae-b333-201e9e3be708-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.203119 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-registry-tls\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.203147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-bound-sa-token\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.203219 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnf7m\" 
(UniqueName: \"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-kube-api-access-pnf7m\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.203244 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7eb2fc63-2d6e-4aae-b333-201e9e3be708-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.203263 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7eb2fc63-2d6e-4aae-b333-201e9e3be708-registry-certificates\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.203313 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.203339 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7eb2fc63-2d6e-4aae-b333-201e9e3be708-trusted-ca\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.230788 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.304178 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7eb2fc63-2d6e-4aae-b333-201e9e3be708-trusted-ca\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.304248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7eb2fc63-2d6e-4aae-b333-201e9e3be708-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.304301 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-registry-tls\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.304333 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-bound-sa-token\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.304389 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnf7m\" (UniqueName: \"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-kube-api-access-pnf7m\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.304423 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7eb2fc63-2d6e-4aae-b333-201e9e3be708-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.304447 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7eb2fc63-2d6e-4aae-b333-201e9e3be708-registry-certificates\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.305885 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7eb2fc63-2d6e-4aae-b333-201e9e3be708-trusted-ca\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.305891 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7eb2fc63-2d6e-4aae-b333-201e9e3be708-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.306101 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7eb2fc63-2d6e-4aae-b333-201e9e3be708-registry-certificates\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.311437 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7eb2fc63-2d6e-4aae-b333-201e9e3be708-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.314512 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-registry-tls\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.326329 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-bound-sa-token\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 
09:20:46.333112 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnf7m\" (UniqueName: \"kubernetes.io/projected/7eb2fc63-2d6e-4aae-b333-201e9e3be708-kube-api-access-pnf7m\") pod \"image-registry-66df7c8f76-bc8c8\" (UID: \"7eb2fc63-2d6e-4aae-b333-201e9e3be708\") " pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.382999 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.867346 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bc8c8"] Mar 11 09:20:46 crc kubenswrapper[4830]: I0311 09:20:46.895807 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" event={"ID":"7eb2fc63-2d6e-4aae-b333-201e9e3be708","Type":"ContainerStarted","Data":"9b40a83095ccf2046e625b6922c6799022959059d8baa9df2eedba54b60b8664"} Mar 11 09:20:47 crc kubenswrapper[4830]: I0311 09:20:47.902846 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" event={"ID":"7eb2fc63-2d6e-4aae-b333-201e9e3be708","Type":"ContainerStarted","Data":"96633914a38172a7dab1eee8ae8aafb71139fcd98a532d5f35d1fd8d287ec581"} Mar 11 09:20:47 crc kubenswrapper[4830]: I0311 09:20:47.903152 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:20:47 crc kubenswrapper[4830]: I0311 09:20:47.924164 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" podStartSLOduration=1.924132709 podStartE2EDuration="1.924132709s" podCreationTimestamp="2026-03-11 09:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:20:47.919471833 +0000 UTC m=+415.700622532" watchObservedRunningTime="2026-03-11 09:20:47.924132709 +0000 UTC m=+415.705283398" Mar 11 09:21:06 crc kubenswrapper[4830]: I0311 09:21:06.389308 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bc8c8" Mar 11 09:21:06 crc kubenswrapper[4830]: I0311 09:21:06.440447 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dpw5"] Mar 11 09:21:13 crc kubenswrapper[4830]: I0311 09:21:13.060380 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:21:13 crc kubenswrapper[4830]: I0311 09:21:13.060889 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:21:31 crc kubenswrapper[4830]: I0311 09:21:31.489509 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" podUID="f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" containerName="registry" containerID="cri-o://f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1" gracePeriod=30 Mar 11 09:21:31 crc kubenswrapper[4830]: I0311 09:21:31.888230 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.031184 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-tls\") pod \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.031241 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-ca-trust-extracted\") pod \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.031277 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-certificates\") pod \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.031406 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.031461 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-trusted-ca\") pod \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.031520 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-installation-pull-secrets\") pod \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.031545 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-bound-sa-token\") pod \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.031579 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwnq2\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-kube-api-access-gwnq2\") pod \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\" (UID: \"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f\") " Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.034187 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.034291 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.055862 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.056222 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.056392 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.056445 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.056716 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-kube-api-access-gwnq2" (OuterVolumeSpecName: "kube-api-access-gwnq2") pod "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f"). InnerVolumeSpecName "kube-api-access-gwnq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.059389 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" (UID: "f6a7090b-02e3-4cf4-a380-4afcd01ceb1f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.133403 4830 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.133447 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwnq2\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-kube-api-access-gwnq2\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.133463 4830 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.133477 4830 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.133490 4830 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.133501 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.133513 4830 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.159684 4830 generic.go:334] "Generic (PLEG): container finished" podID="f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" containerID="f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1" exitCode=0 Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.159895 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" event={"ID":"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f","Type":"ContainerDied","Data":"f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1"} Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.160149 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" event={"ID":"f6a7090b-02e3-4cf4-a380-4afcd01ceb1f","Type":"ContainerDied","Data":"692cd07551535b6941e04158ff6a512d797a544d425a32548a370154fdf2ae46"} Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.160286 4830 scope.go:117] "RemoveContainer" 
containerID="f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.159996 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8dpw5" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.187879 4830 scope.go:117] "RemoveContainer" containerID="f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1" Mar 11 09:21:32 crc kubenswrapper[4830]: E0311 09:21:32.188898 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1\": container with ID starting with f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1 not found: ID does not exist" containerID="f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.189101 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1"} err="failed to get container status \"f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1\": rpc error: code = NotFound desc = could not find container \"f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1\": container with ID starting with f3bf926057ad9683e1059cc087250738588de502d405bd5eee6c72dc119203d1 not found: ID does not exist" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.197576 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dpw5"] Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.203063 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8dpw5"] Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.947785 4830 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" path="/var/lib/kubelet/pods/f6a7090b-02e3-4cf4-a380-4afcd01ceb1f/volumes" Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.972816 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x9zpp"] Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.973069 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x9zpp" podUID="7febf059-370d-4a68-a543-3b23879ba479" containerName="registry-server" containerID="cri-o://4fabe6c2638348f3f715899a472a45c820512f348470217255228292ad020a41" gracePeriod=30 Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.980559 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6st5"] Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.980938 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m6st5" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="registry-server" containerID="cri-o://9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274" gracePeriod=30 Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.991749 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pplbq"] Mar 11 09:21:32 crc kubenswrapper[4830]: I0311 09:21:32.991995 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" containerID="cri-o://742a30f6c6d64fe69e0bd3b024a5622c43a61519086e37cef7c11ca146eed70f" gracePeriod=30 Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.006367 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qgv"] Mar 11 09:21:33 crc 
kubenswrapper[4830]: I0311 09:21:33.006743 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d5qgv" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerName="registry-server" containerID="cri-o://81df5ae2df36edb1b9c15a9635b190d74b30a0827b8307a98022c6328c0c3a87" gracePeriod=30 Mar 11 09:21:33 crc kubenswrapper[4830]: E0311 09:21:33.014119 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274" cmd=["grpc_health_probe","-addr=:50051"] Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.018848 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ctcw6"] Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.019207 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ctcw6" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="registry-server" containerID="cri-o://63082d06d754858f7393926be38640a7952c235118ca7c7524553c8472b30a58" gracePeriod=30 Mar 11 09:21:33 crc kubenswrapper[4830]: E0311 09:21:33.020112 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274 is running failed: container process not found" containerID="9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274" cmd=["grpc_health_probe","-addr=:50051"] Mar 11 09:21:33 crc kubenswrapper[4830]: E0311 09:21:33.020459 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274 is running failed: container process not found" containerID="9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274" cmd=["grpc_health_probe","-addr=:50051"] Mar 11 09:21:33 crc kubenswrapper[4830]: E0311 09:21:33.020511 4830 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-m6st5" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="registry-server" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.031413 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vjsls"] Mar 11 09:21:33 crc kubenswrapper[4830]: E0311 09:21:33.031679 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" containerName="registry" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.031693 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" containerName="registry" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.031818 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a7090b-02e3-4cf4-a380-4afcd01ceb1f" containerName="registry" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.032309 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.035153 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vjsls"] Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.145777 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74cca8dd-8cb4-41db-9626-9612877ad60e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.145826 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74cca8dd-8cb4-41db-9626-9612877ad60e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.145862 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9z7t\" (UniqueName: \"kubernetes.io/projected/74cca8dd-8cb4-41db-9626-9612877ad60e-kube-api-access-q9z7t\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.175499 4830 generic.go:334] "Generic (PLEG): container finished" podID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerID="81df5ae2df36edb1b9c15a9635b190d74b30a0827b8307a98022c6328c0c3a87" exitCode=0 Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.175587 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-d5qgv" event={"ID":"44447137-e7c5-4a07-bcdd-6bcc4c835f79","Type":"ContainerDied","Data":"81df5ae2df36edb1b9c15a9635b190d74b30a0827b8307a98022c6328c0c3a87"} Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.184240 4830 generic.go:334] "Generic (PLEG): container finished" podID="7febf059-370d-4a68-a543-3b23879ba479" containerID="4fabe6c2638348f3f715899a472a45c820512f348470217255228292ad020a41" exitCode=0 Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.184337 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9zpp" event={"ID":"7febf059-370d-4a68-a543-3b23879ba479","Type":"ContainerDied","Data":"4fabe6c2638348f3f715899a472a45c820512f348470217255228292ad020a41"} Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.187454 4830 generic.go:334] "Generic (PLEG): container finished" podID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerID="63082d06d754858f7393926be38640a7952c235118ca7c7524553c8472b30a58" exitCode=0 Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.187522 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ctcw6" event={"ID":"2438f79c-45d2-4b4f-951b-630d3fb2c740","Type":"ContainerDied","Data":"63082d06d754858f7393926be38640a7952c235118ca7c7524553c8472b30a58"} Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.190617 4830 generic.go:334] "Generic (PLEG): container finished" podID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerID="9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274" exitCode=0 Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.190668 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6st5" event={"ID":"b6155028-4ba3-48be-b83d-7bbe65f28ba7","Type":"ContainerDied","Data":"9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274"} Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 
09:21:33.195595 4830 generic.go:334] "Generic (PLEG): container finished" podID="c61d535e-afb5-4006-a758-8bba8735a860" containerID="742a30f6c6d64fe69e0bd3b024a5622c43a61519086e37cef7c11ca146eed70f" exitCode=0 Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.195639 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" event={"ID":"c61d535e-afb5-4006-a758-8bba8735a860","Type":"ContainerDied","Data":"742a30f6c6d64fe69e0bd3b024a5622c43a61519086e37cef7c11ca146eed70f"} Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.195681 4830 scope.go:117] "RemoveContainer" containerID="e908d7bcbb7cc0c0a321bddd4f7137df0885dfd2db6cf98d4f3fc7e53b9125da" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.247308 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74cca8dd-8cb4-41db-9626-9612877ad60e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.247347 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74cca8dd-8cb4-41db-9626-9612877ad60e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.247389 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9z7t\" (UniqueName: \"kubernetes.io/projected/74cca8dd-8cb4-41db-9626-9612877ad60e-kube-api-access-q9z7t\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.251134 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74cca8dd-8cb4-41db-9626-9612877ad60e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.261028 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/74cca8dd-8cb4-41db-9626-9612877ad60e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.266674 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9z7t\" (UniqueName: \"kubernetes.io/projected/74cca8dd-8cb4-41db-9626-9612877ad60e-kube-api-access-q9z7t\") pod \"marketplace-operator-79b997595-vjsls\" (UID: \"74cca8dd-8cb4-41db-9626-9612877ad60e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.469287 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.475610 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.481566 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.494812 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.497178 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.501387 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549623 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-operator-metrics\") pod \"c61d535e-afb5-4006-a758-8bba8735a860\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549686 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-catalog-content\") pod \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549709 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-utilities\") pod \"7febf059-370d-4a68-a543-3b23879ba479\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549732 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx2vk\" (UniqueName: 
\"kubernetes.io/projected/c61d535e-afb5-4006-a758-8bba8735a860-kube-api-access-fx2vk\") pod \"c61d535e-afb5-4006-a758-8bba8735a860\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549751 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-catalog-content\") pod \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549780 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmw9f\" (UniqueName: \"kubernetes.io/projected/b6155028-4ba3-48be-b83d-7bbe65f28ba7-kube-api-access-qmw9f\") pod \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549798 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dv6z\" (UniqueName: \"kubernetes.io/projected/44447137-e7c5-4a07-bcdd-6bcc4c835f79-kube-api-access-7dv6z\") pod \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549837 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhjg4\" (UniqueName: \"kubernetes.io/projected/7febf059-370d-4a68-a543-3b23879ba479-kube-api-access-vhjg4\") pod \"7febf059-370d-4a68-a543-3b23879ba479\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549864 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-utilities\") pod \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\" (UID: \"b6155028-4ba3-48be-b83d-7bbe65f28ba7\") " Mar 11 
09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549904 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-catalog-content\") pod \"7febf059-370d-4a68-a543-3b23879ba479\" (UID: \"7febf059-370d-4a68-a543-3b23879ba479\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549946 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8wsg\" (UniqueName: \"kubernetes.io/projected/2438f79c-45d2-4b4f-951b-630d3fb2c740-kube-api-access-s8wsg\") pod \"2438f79c-45d2-4b4f-951b-630d3fb2c740\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549967 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-catalog-content\") pod \"2438f79c-45d2-4b4f-951b-630d3fb2c740\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.549987 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-trusted-ca\") pod \"c61d535e-afb5-4006-a758-8bba8735a860\" (UID: \"c61d535e-afb5-4006-a758-8bba8735a860\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.550007 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-utilities\") pod \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\" (UID: \"44447137-e7c5-4a07-bcdd-6bcc4c835f79\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.550049 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-utilities\") pod \"2438f79c-45d2-4b4f-951b-630d3fb2c740\" (UID: \"2438f79c-45d2-4b4f-951b-630d3fb2c740\") " Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.555291 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6155028-4ba3-48be-b83d-7bbe65f28ba7-kube-api-access-qmw9f" (OuterVolumeSpecName: "kube-api-access-qmw9f") pod "b6155028-4ba3-48be-b83d-7bbe65f28ba7" (UID: "b6155028-4ba3-48be-b83d-7bbe65f28ba7"). InnerVolumeSpecName "kube-api-access-qmw9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.565723 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44447137-e7c5-4a07-bcdd-6bcc4c835f79-kube-api-access-7dv6z" (OuterVolumeSpecName: "kube-api-access-7dv6z") pod "44447137-e7c5-4a07-bcdd-6bcc4c835f79" (UID: "44447137-e7c5-4a07-bcdd-6bcc4c835f79"). InnerVolumeSpecName "kube-api-access-7dv6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.566561 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c61d535e-afb5-4006-a758-8bba8735a860" (UID: "c61d535e-afb5-4006-a758-8bba8735a860"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.567624 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-utilities" (OuterVolumeSpecName: "utilities") pod "44447137-e7c5-4a07-bcdd-6bcc4c835f79" (UID: "44447137-e7c5-4a07-bcdd-6bcc4c835f79"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.568255 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-utilities" (OuterVolumeSpecName: "utilities") pod "7febf059-370d-4a68-a543-3b23879ba479" (UID: "7febf059-370d-4a68-a543-3b23879ba479"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.569381 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-utilities" (OuterVolumeSpecName: "utilities") pod "2438f79c-45d2-4b4f-951b-630d3fb2c740" (UID: "2438f79c-45d2-4b4f-951b-630d3fb2c740"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.570542 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-utilities" (OuterVolumeSpecName: "utilities") pod "b6155028-4ba3-48be-b83d-7bbe65f28ba7" (UID: "b6155028-4ba3-48be-b83d-7bbe65f28ba7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.575186 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7febf059-370d-4a68-a543-3b23879ba479-kube-api-access-vhjg4" (OuterVolumeSpecName: "kube-api-access-vhjg4") pod "7febf059-370d-4a68-a543-3b23879ba479" (UID: "7febf059-370d-4a68-a543-3b23879ba479"). InnerVolumeSpecName "kube-api-access-vhjg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.575304 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61d535e-afb5-4006-a758-8bba8735a860-kube-api-access-fx2vk" (OuterVolumeSpecName: "kube-api-access-fx2vk") pod "c61d535e-afb5-4006-a758-8bba8735a860" (UID: "c61d535e-afb5-4006-a758-8bba8735a860"). InnerVolumeSpecName "kube-api-access-fx2vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.576836 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c61d535e-afb5-4006-a758-8bba8735a860" (UID: "c61d535e-afb5-4006-a758-8bba8735a860"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.578196 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2438f79c-45d2-4b4f-951b-630d3fb2c740-kube-api-access-s8wsg" (OuterVolumeSpecName: "kube-api-access-s8wsg") pod "2438f79c-45d2-4b4f-951b-630d3fb2c740" (UID: "2438f79c-45d2-4b4f-951b-630d3fb2c740"). InnerVolumeSpecName "kube-api-access-s8wsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.587141 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44447137-e7c5-4a07-bcdd-6bcc4c835f79" (UID: "44447137-e7c5-4a07-bcdd-6bcc4c835f79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.628473 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6155028-4ba3-48be-b83d-7bbe65f28ba7" (UID: "b6155028-4ba3-48be-b83d-7bbe65f28ba7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.648964 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7febf059-370d-4a68-a543-3b23879ba479" (UID: "7febf059-370d-4a68-a543-3b23879ba479"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651809 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651838 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8wsg\" (UniqueName: \"kubernetes.io/projected/2438f79c-45d2-4b4f-951b-630d3fb2c740-kube-api-access-s8wsg\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651848 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651857 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651869 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651877 4830 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c61d535e-afb5-4006-a758-8bba8735a860-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651885 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7febf059-370d-4a68-a543-3b23879ba479-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651895 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44447137-e7c5-4a07-bcdd-6bcc4c835f79-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651903 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx2vk\" (UniqueName: \"kubernetes.io/projected/c61d535e-afb5-4006-a758-8bba8735a860-kube-api-access-fx2vk\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651912 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651922 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmw9f\" (UniqueName: \"kubernetes.io/projected/b6155028-4ba3-48be-b83d-7bbe65f28ba7-kube-api-access-qmw9f\") on node 
\"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651934 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dv6z\" (UniqueName: \"kubernetes.io/projected/44447137-e7c5-4a07-bcdd-6bcc4c835f79-kube-api-access-7dv6z\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651943 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhjg4\" (UniqueName: \"kubernetes.io/projected/7febf059-370d-4a68-a543-3b23879ba479-kube-api-access-vhjg4\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.651953 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6155028-4ba3-48be-b83d-7bbe65f28ba7-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.712475 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2438f79c-45d2-4b4f-951b-630d3fb2c740" (UID: "2438f79c-45d2-4b4f-951b-630d3fb2c740"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.752908 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2438f79c-45d2-4b4f-951b-630d3fb2c740-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:21:33 crc kubenswrapper[4830]: I0311 09:21:33.885205 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vjsls"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.203107 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" event={"ID":"c61d535e-afb5-4006-a758-8bba8735a860","Type":"ContainerDied","Data":"7ae89d9a229f8f2e3380ede3842d1b0e91bfbefa876a9344f3df94d5ca4e794b"} Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.203159 4830 scope.go:117] "RemoveContainer" containerID="742a30f6c6d64fe69e0bd3b024a5622c43a61519086e37cef7c11ca146eed70f" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.203241 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pplbq" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.205473 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" event={"ID":"74cca8dd-8cb4-41db-9626-9612877ad60e","Type":"ContainerStarted","Data":"125e9f416bf19ff1b34e1571184ea252a3982afc3661bd61445b916a10a29d23"} Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.205640 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.205776 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" event={"ID":"74cca8dd-8cb4-41db-9626-9612877ad60e","Type":"ContainerStarted","Data":"ee6b51819eb44d59450af92c6a1dda31ba1aa0906494a0b6e7fddcd7c04729c4"} Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.207173 4830 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vjsls container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" start-of-body= Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.207226 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" podUID="74cca8dd-8cb4-41db-9626-9612877ad60e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.207935 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5qgv" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.207937 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5qgv" event={"ID":"44447137-e7c5-4a07-bcdd-6bcc4c835f79","Type":"ContainerDied","Data":"610c149787db87222d534cd474c89236f4a7fbb80e9d21c6f6a0f37a5d178f9a"} Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.211359 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9zpp" event={"ID":"7febf059-370d-4a68-a543-3b23879ba479","Type":"ContainerDied","Data":"3a0863179e12d91cb364b7003106964135055c075bb0a2a4790b50b498f254f3"} Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.211376 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x9zpp" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.220492 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ctcw6" event={"ID":"2438f79c-45d2-4b4f-951b-630d3fb2c740","Type":"ContainerDied","Data":"b56fa36c75fca2ce62a168f50bd57712ff1c5eacd51f1276a6625310ec00011d"} Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.220792 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ctcw6" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.235368 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6st5" event={"ID":"b6155028-4ba3-48be-b83d-7bbe65f28ba7","Type":"ContainerDied","Data":"2b591f42d4c33f8d90a60ff1d77e715510852fd9db49ed1d2251b92629b1a778"} Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.235493 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6st5" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.236409 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" podStartSLOduration=2.236385864 podStartE2EDuration="2.236385864s" podCreationTimestamp="2026-03-11 09:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:21:34.230226217 +0000 UTC m=+462.011376906" watchObservedRunningTime="2026-03-11 09:21:34.236385864 +0000 UTC m=+462.017536563" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.243497 4830 scope.go:117] "RemoveContainer" containerID="81df5ae2df36edb1b9c15a9635b190d74b30a0827b8307a98022c6328c0c3a87" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.275722 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qgv"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.277793 4830 scope.go:117] "RemoveContainer" containerID="f7857ec02c55ae21e66269ceca325950789db592646ee7356fc0c84b682d958b" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.282064 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5qgv"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.307722 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pplbq"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.310592 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pplbq"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.312513 4830 scope.go:117] "RemoveContainer" containerID="17e4c1afc289200a7b31eaa35854ad7b113c5842a8bc517cc8073b90b972c75f" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.319781 4830 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x9zpp"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.323734 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x9zpp"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.327455 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6st5"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.333583 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m6st5"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.335347 4830 scope.go:117] "RemoveContainer" containerID="4fabe6c2638348f3f715899a472a45c820512f348470217255228292ad020a41" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.337635 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ctcw6"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.340192 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ctcw6"] Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.352436 4830 scope.go:117] "RemoveContainer" containerID="e6f5008ca1aef966c2e2824f61827c11e1daf475cf560fb0f418531d4342f63f" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.371839 4830 scope.go:117] "RemoveContainer" containerID="ba13d6dd3b18dc83f39343ace3841bc0c5cfd9957c62021f9e678ccae4cea5bb" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.386238 4830 scope.go:117] "RemoveContainer" containerID="63082d06d754858f7393926be38640a7952c235118ca7c7524553c8472b30a58" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.407162 4830 scope.go:117] "RemoveContainer" containerID="d1b1085e5fe00d505c1be4487326aac211bdf6d3493086835a5738984048e1fd" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.432301 4830 scope.go:117] "RemoveContainer" 
containerID="b7253f3f04390e521eed635ac3c777bec154573f4ee5d1b2a869a993d1e12bd5" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.446513 4830 scope.go:117] "RemoveContainer" containerID="9bbde5edb33810b79ae0b7b111389b1b1afe0a8d828c9c243f7975ed772eb274" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.458504 4830 scope.go:117] "RemoveContainer" containerID="f82588298a3aa4fe180efb0045e87538178318511717cec88b7ce49535e6395e" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.470247 4830 scope.go:117] "RemoveContainer" containerID="4e601c1281cd1b3adae8a51281dd286c73a2a6d816fc5c6cdbe081c13e8aab8a" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.943415 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" path="/var/lib/kubelet/pods/2438f79c-45d2-4b4f-951b-630d3fb2c740/volumes" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.947148 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" path="/var/lib/kubelet/pods/44447137-e7c5-4a07-bcdd-6bcc4c835f79/volumes" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.948012 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7febf059-370d-4a68-a543-3b23879ba479" path="/var/lib/kubelet/pods/7febf059-370d-4a68-a543-3b23879ba479/volumes" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.948910 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" path="/var/lib/kubelet/pods/b6155028-4ba3-48be-b83d-7bbe65f28ba7/volumes" Mar 11 09:21:34 crc kubenswrapper[4830]: I0311 09:21:34.950547 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61d535e-afb5-4006-a758-8bba8735a860" path="/var/lib/kubelet/pods/c61d535e-afb5-4006-a758-8bba8735a860/volumes" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.190867 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-b525m"] Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191188 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191210 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191228 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="extract-utilities" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191240 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="extract-utilities" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191258 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191271 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191294 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="extract-content" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191305 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="extract-content" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191319 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191330 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191348 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191359 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191551 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerName="extract-utilities" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191567 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerName="extract-utilities" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191604 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="extract-content" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191617 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="extract-content" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191637 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="extract-utilities" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191649 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="extract-utilities" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191668 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7febf059-370d-4a68-a543-3b23879ba479" containerName="extract-utilities" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191680 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7febf059-370d-4a68-a543-3b23879ba479" containerName="extract-utilities" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191697 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerName="extract-content" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191708 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerName="extract-content" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191726 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7febf059-370d-4a68-a543-3b23879ba479" containerName="extract-content" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191740 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7febf059-370d-4a68-a543-3b23879ba479" containerName="extract-content" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.191756 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7febf059-370d-4a68-a543-3b23879ba479" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191768 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7febf059-370d-4a68-a543-3b23879ba479" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191917 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="44447137-e7c5-4a07-bcdd-6bcc4c835f79" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191947 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191968 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2438f79c-45d2-4b4f-951b-630d3fb2c740" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.191985 4830 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.192006 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6155028-4ba3-48be-b83d-7bbe65f28ba7" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.192045 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7febf059-370d-4a68-a543-3b23879ba479" containerName="registry-server" Mar 11 09:21:35 crc kubenswrapper[4830]: E0311 09:21:35.192229 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.192245 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61d535e-afb5-4006-a758-8bba8735a860" containerName="marketplace-operator" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.193493 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.200885 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b525m"] Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.203814 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.252530 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vjsls" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.279820 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/878a9e87-f5ea-4c96-8439-28ccc445778b-catalog-content\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.279897 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldwff\" (UniqueName: \"kubernetes.io/projected/878a9e87-f5ea-4c96-8439-28ccc445778b-kube-api-access-ldwff\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.280090 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/878a9e87-f5ea-4c96-8439-28ccc445778b-utilities\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.375634 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-5ljms"] Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.377053 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.379611 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.380957 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/878a9e87-f5ea-4c96-8439-28ccc445778b-utilities\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.381112 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/878a9e87-f5ea-4c96-8439-28ccc445778b-catalog-content\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.381134 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldwff\" (UniqueName: \"kubernetes.io/projected/878a9e87-f5ea-4c96-8439-28ccc445778b-kube-api-access-ldwff\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.381995 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/878a9e87-f5ea-4c96-8439-28ccc445778b-catalog-content\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " 
pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.382408 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/878a9e87-f5ea-4c96-8439-28ccc445778b-utilities\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.391702 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ljms"] Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.415826 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldwff\" (UniqueName: \"kubernetes.io/projected/878a9e87-f5ea-4c96-8439-28ccc445778b-kube-api-access-ldwff\") pod \"redhat-marketplace-b525m\" (UID: \"878a9e87-f5ea-4c96-8439-28ccc445778b\") " pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.482805 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb5kq\" (UniqueName: \"kubernetes.io/projected/2635fd4f-dc7a-4524-bf2d-6307f49363c9-kube-api-access-gb5kq\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.482853 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-utilities\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.482890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-catalog-content\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.520665 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b525m" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.584124 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb5kq\" (UniqueName: \"kubernetes.io/projected/2635fd4f-dc7a-4524-bf2d-6307f49363c9-kube-api-access-gb5kq\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.584440 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-utilities\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.584492 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-catalog-content\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.584920 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-utilities\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " 
pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.585041 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-catalog-content\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.618175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb5kq\" (UniqueName: \"kubernetes.io/projected/2635fd4f-dc7a-4524-bf2d-6307f49363c9-kube-api-access-gb5kq\") pod \"certified-operators-5ljms\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.699633 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.732668 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b525m"] Mar 11 09:21:35 crc kubenswrapper[4830]: W0311 09:21:35.738438 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod878a9e87_f5ea_4c96_8439_28ccc445778b.slice/crio-98b26bfd3bfde66ce280daae8c3b30ede3439ccdb86aab28142ce5c0f0cd2a02 WatchSource:0}: Error finding container 98b26bfd3bfde66ce280daae8c3b30ede3439ccdb86aab28142ce5c0f0cd2a02: Status 404 returned error can't find the container with id 98b26bfd3bfde66ce280daae8c3b30ede3439ccdb86aab28142ce5c0f0cd2a02 Mar 11 09:21:35 crc kubenswrapper[4830]: I0311 09:21:35.958477 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ljms"] Mar 11 09:21:35 crc kubenswrapper[4830]: W0311 09:21:35.979430 4830 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2635fd4f_dc7a_4524_bf2d_6307f49363c9.slice/crio-e120c111fd30f51277128c7972bf7915dc62c04b82b1ee25b116c486f095463d WatchSource:0}: Error finding container e120c111fd30f51277128c7972bf7915dc62c04b82b1ee25b116c486f095463d: Status 404 returned error can't find the container with id e120c111fd30f51277128c7972bf7915dc62c04b82b1ee25b116c486f095463d Mar 11 09:21:36 crc kubenswrapper[4830]: I0311 09:21:36.257491 4830 generic.go:334] "Generic (PLEG): container finished" podID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerID="323cfffe2a76ceb6926439e0777c22f7c6fb9dfdb6a2336bd6f147f6970ceb6c" exitCode=0 Mar 11 09:21:36 crc kubenswrapper[4830]: I0311 09:21:36.257536 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljms" event={"ID":"2635fd4f-dc7a-4524-bf2d-6307f49363c9","Type":"ContainerDied","Data":"323cfffe2a76ceb6926439e0777c22f7c6fb9dfdb6a2336bd6f147f6970ceb6c"} Mar 11 09:21:36 crc kubenswrapper[4830]: I0311 09:21:36.257579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljms" event={"ID":"2635fd4f-dc7a-4524-bf2d-6307f49363c9","Type":"ContainerStarted","Data":"e120c111fd30f51277128c7972bf7915dc62c04b82b1ee25b116c486f095463d"} Mar 11 09:21:36 crc kubenswrapper[4830]: I0311 09:21:36.259188 4830 generic.go:334] "Generic (PLEG): container finished" podID="878a9e87-f5ea-4c96-8439-28ccc445778b" containerID="16177ff95eb8a1341564adaaa327aebce58a349cd46bf2f45221cac69ae605d1" exitCode=0 Mar 11 09:21:36 crc kubenswrapper[4830]: I0311 09:21:36.259291 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b525m" event={"ID":"878a9e87-f5ea-4c96-8439-28ccc445778b","Type":"ContainerDied","Data":"16177ff95eb8a1341564adaaa327aebce58a349cd46bf2f45221cac69ae605d1"} Mar 11 09:21:36 crc kubenswrapper[4830]: I0311 
09:21:36.259347 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b525m" event={"ID":"878a9e87-f5ea-4c96-8439-28ccc445778b","Type":"ContainerStarted","Data":"98b26bfd3bfde66ce280daae8c3b30ede3439ccdb86aab28142ce5c0f0cd2a02"} Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.266195 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljms" event={"ID":"2635fd4f-dc7a-4524-bf2d-6307f49363c9","Type":"ContainerStarted","Data":"6e1114c70394434abf4089f3b9d9141ed415f68925e14b3295c44bfcf213ae1e"} Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.584139 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qn82n"] Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.585327 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.594766 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.600396 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qn82n"] Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.714507 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-utilities\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.714557 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdg92\" (UniqueName: 
\"kubernetes.io/projected/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-kube-api-access-zdg92\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.714578 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-catalog-content\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.783437 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nt9xr"] Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.785251 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nt9xr" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.791766 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.797307 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nt9xr"] Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.815894 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-utilities\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.815951 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdg92\" (UniqueName: 
\"kubernetes.io/projected/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-kube-api-access-zdg92\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.815977 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-catalog-content\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.816395 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-catalog-content\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.816595 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-utilities\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.840494 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdg92\" (UniqueName: \"kubernetes.io/projected/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-kube-api-access-zdg92\") pod \"redhat-operators-qn82n\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.917489 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdssk\" (UniqueName: 
\"kubernetes.io/projected/47ec45e2-dfcb-4c82-955a-0f820e2d0210-kube-api-access-vdssk\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.917780 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ec45e2-dfcb-4c82-955a-0f820e2d0210-catalog-content\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.917937 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ec45e2-dfcb-4c82-955a-0f820e2d0210-utilities\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:37 crc kubenswrapper[4830]: I0311 09:21:37.918390 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qn82n"
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.019705 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ec45e2-dfcb-4c82-955a-0f820e2d0210-catalog-content\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.020212 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ec45e2-dfcb-4c82-955a-0f820e2d0210-utilities\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.020570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ec45e2-dfcb-4c82-955a-0f820e2d0210-catalog-content\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.020860 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ec45e2-dfcb-4c82-955a-0f820e2d0210-utilities\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.021095 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdssk\" (UniqueName: \"kubernetes.io/projected/47ec45e2-dfcb-4c82-955a-0f820e2d0210-kube-api-access-vdssk\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.043752 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdssk\" (UniqueName: \"kubernetes.io/projected/47ec45e2-dfcb-4c82-955a-0f820e2d0210-kube-api-access-vdssk\") pod \"community-operators-nt9xr\" (UID: \"47ec45e2-dfcb-4c82-955a-0f820e2d0210\") " pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.104942 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.110541 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qn82n"]
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.274936 4830 generic.go:334] "Generic (PLEG): container finished" podID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerID="6e1114c70394434abf4089f3b9d9141ed415f68925e14b3295c44bfcf213ae1e" exitCode=0
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.275300 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljms" event={"ID":"2635fd4f-dc7a-4524-bf2d-6307f49363c9","Type":"ContainerDied","Data":"6e1114c70394434abf4089f3b9d9141ed415f68925e14b3295c44bfcf213ae1e"}
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.278061 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn82n" event={"ID":"86ab19b6-db7c-4c64-a5cb-cc60d48e1570","Type":"ContainerStarted","Data":"5dd45e289ff48d0e5b02f1dfa232b824736d9a061e8183e74916274a005eb00f"}
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.278100 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn82n" event={"ID":"86ab19b6-db7c-4c64-a5cb-cc60d48e1570","Type":"ContainerStarted","Data":"65138cbca454c2616a8b8fe79277c3bf3cb5d2180c60a48fb07b66f366065d8f"}
Mar 11 09:21:38 crc kubenswrapper[4830]: I0311 09:21:38.298984 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nt9xr"]
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.288643 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljms" event={"ID":"2635fd4f-dc7a-4524-bf2d-6307f49363c9","Type":"ContainerStarted","Data":"db72a4b45d8609aa4b31c10f192a8c101361b5bbb2227097178de533a6c8c11e"}
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.290991 4830 generic.go:334] "Generic (PLEG): container finished" podID="878a9e87-f5ea-4c96-8439-28ccc445778b" containerID="0b8715af96a00fa8cc8365526eb7ccaad5f21f0d392781b7896c218585fbea7e" exitCode=0
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.291038 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b525m" event={"ID":"878a9e87-f5ea-4c96-8439-28ccc445778b","Type":"ContainerDied","Data":"0b8715af96a00fa8cc8365526eb7ccaad5f21f0d392781b7896c218585fbea7e"}
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.292651 4830 generic.go:334] "Generic (PLEG): container finished" podID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerID="5dd45e289ff48d0e5b02f1dfa232b824736d9a061e8183e74916274a005eb00f" exitCode=0
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.292687 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn82n" event={"ID":"86ab19b6-db7c-4c64-a5cb-cc60d48e1570","Type":"ContainerDied","Data":"5dd45e289ff48d0e5b02f1dfa232b824736d9a061e8183e74916274a005eb00f"}
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.295866 4830 generic.go:334] "Generic (PLEG): container finished" podID="47ec45e2-dfcb-4c82-955a-0f820e2d0210" containerID="18640f69990802793c49008c5e82cfeecf522b38f991152bc8616770c187c678" exitCode=0
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.295903 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nt9xr" event={"ID":"47ec45e2-dfcb-4c82-955a-0f820e2d0210","Type":"ContainerDied","Data":"18640f69990802793c49008c5e82cfeecf522b38f991152bc8616770c187c678"}
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.295928 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nt9xr" event={"ID":"47ec45e2-dfcb-4c82-955a-0f820e2d0210","Type":"ContainerStarted","Data":"8b53423bfe036aeeef69e8438d60b6a4db5a2fb5c4d51a921015eb281c8233a4"}
Mar 11 09:21:39 crc kubenswrapper[4830]: I0311 09:21:39.319625 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5ljms" podStartSLOduration=1.8632757290000002 podStartE2EDuration="4.319599041s" podCreationTimestamp="2026-03-11 09:21:35 +0000 UTC" firstStartedPulling="2026-03-11 09:21:36.267867928 +0000 UTC m=+464.049018617" lastFinishedPulling="2026-03-11 09:21:38.72419124 +0000 UTC m=+466.505341929" observedRunningTime="2026-03-11 09:21:39.311928482 +0000 UTC m=+467.093079201" watchObservedRunningTime="2026-03-11 09:21:39.319599041 +0000 UTC m=+467.100749760"
Mar 11 09:21:40 crc kubenswrapper[4830]: I0311 09:21:40.302667 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b525m" event={"ID":"878a9e87-f5ea-4c96-8439-28ccc445778b","Type":"ContainerStarted","Data":"01efa62ae7c2e0d74c2950465aae783d65279191e3583bf4f14e97f8922b5110"}
Mar 11 09:21:40 crc kubenswrapper[4830]: I0311 09:21:40.327814 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b525m" podStartSLOduration=1.694824325 podStartE2EDuration="5.327789232s" podCreationTimestamp="2026-03-11 09:21:35 +0000 UTC" firstStartedPulling="2026-03-11 09:21:36.267982771 +0000 UTC m=+464.049133460" lastFinishedPulling="2026-03-11 09:21:39.900947638 +0000 UTC m=+467.682098367" observedRunningTime="2026-03-11 09:21:40.325340265 +0000 UTC m=+468.106490974" watchObservedRunningTime="2026-03-11 09:21:40.327789232 +0000 UTC m=+468.108939921"
Mar 11 09:21:41 crc kubenswrapper[4830]: I0311 09:21:41.309884 4830 generic.go:334] "Generic (PLEG): container finished" podID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerID="ce4bac958f32797ea6fe525f84e94b8409440d09c97b9a0588f6dd265a6a8fee" exitCode=0
Mar 11 09:21:41 crc kubenswrapper[4830]: I0311 09:21:41.309933 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn82n" event={"ID":"86ab19b6-db7c-4c64-a5cb-cc60d48e1570","Type":"ContainerDied","Data":"ce4bac958f32797ea6fe525f84e94b8409440d09c97b9a0588f6dd265a6a8fee"}
Mar 11 09:21:41 crc kubenswrapper[4830]: I0311 09:21:41.317947 4830 generic.go:334] "Generic (PLEG): container finished" podID="47ec45e2-dfcb-4c82-955a-0f820e2d0210" containerID="314f73632b63ba7c0bc26909ed3b9acdcbecd5202772f4e9319d0a0cb120b1a0" exitCode=0
Mar 11 09:21:41 crc kubenswrapper[4830]: I0311 09:21:41.318437 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nt9xr" event={"ID":"47ec45e2-dfcb-4c82-955a-0f820e2d0210","Type":"ContainerDied","Data":"314f73632b63ba7c0bc26909ed3b9acdcbecd5202772f4e9319d0a0cb120b1a0"}
Mar 11 09:21:42 crc kubenswrapper[4830]: I0311 09:21:42.328761 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nt9xr" event={"ID":"47ec45e2-dfcb-4c82-955a-0f820e2d0210","Type":"ContainerStarted","Data":"c83082479cbb3e4e1738527b3116c06635fb915c2ffd91918db2f5088f41abcd"}
Mar 11 09:21:42 crc kubenswrapper[4830]: I0311 09:21:42.331541 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn82n" event={"ID":"86ab19b6-db7c-4c64-a5cb-cc60d48e1570","Type":"ContainerStarted","Data":"ef8f038907985318cf60fb400b083a35a9e9dd64392811c9742322d6bf82d113"}
Mar 11 09:21:42 crc kubenswrapper[4830]: I0311 09:21:42.356718 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nt9xr" podStartSLOduration=2.938633844 podStartE2EDuration="5.356699365s" podCreationTimestamp="2026-03-11 09:21:37 +0000 UTC" firstStartedPulling="2026-03-11 09:21:39.300530352 +0000 UTC m=+467.081681071" lastFinishedPulling="2026-03-11 09:21:41.718595893 +0000 UTC m=+469.499746592" observedRunningTime="2026-03-11 09:21:42.351657908 +0000 UTC m=+470.132808627" watchObservedRunningTime="2026-03-11 09:21:42.356699365 +0000 UTC m=+470.137850074"
Mar 11 09:21:42 crc kubenswrapper[4830]: I0311 09:21:42.373613 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qn82n" podStartSLOduration=2.957132697 podStartE2EDuration="5.373589425s" podCreationTimestamp="2026-03-11 09:21:37 +0000 UTC" firstStartedPulling="2026-03-11 09:21:39.293953852 +0000 UTC m=+467.075104571" lastFinishedPulling="2026-03-11 09:21:41.71041056 +0000 UTC m=+469.491561299" observedRunningTime="2026-03-11 09:21:42.371136648 +0000 UTC m=+470.152287367" watchObservedRunningTime="2026-03-11 09:21:42.373589425 +0000 UTC m=+470.154740154"
Mar 11 09:21:43 crc kubenswrapper[4830]: I0311 09:21:43.060008 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 09:21:43 crc kubenswrapper[4830]: I0311 09:21:43.060141 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 09:21:45 crc kubenswrapper[4830]: I0311 09:21:45.521341 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b525m"
Mar 11 09:21:45 crc kubenswrapper[4830]: I0311 09:21:45.521760 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b525m"
Mar 11 09:21:45 crc kubenswrapper[4830]: I0311 09:21:45.581566 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b525m"
Mar 11 09:21:45 crc kubenswrapper[4830]: I0311 09:21:45.701091 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5ljms"
Mar 11 09:21:45 crc kubenswrapper[4830]: I0311 09:21:45.701156 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5ljms"
Mar 11 09:21:45 crc kubenswrapper[4830]: I0311 09:21:45.775581 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5ljms"
Mar 11 09:21:46 crc kubenswrapper[4830]: I0311 09:21:46.433701 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5ljms"
Mar 11 09:21:46 crc kubenswrapper[4830]: I0311 09:21:46.436838 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b525m"
Mar 11 09:21:47 crc kubenswrapper[4830]: I0311 09:21:47.919127 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qn82n"
Mar 11 09:21:47 crc kubenswrapper[4830]: I0311 09:21:47.921936 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qn82n"
Mar 11 09:21:48 crc kubenswrapper[4830]: I0311 09:21:48.105947 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:48 crc kubenswrapper[4830]: I0311 09:21:48.106013 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:48 crc kubenswrapper[4830]: I0311 09:21:48.152101 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:48 crc kubenswrapper[4830]: I0311 09:21:48.434389 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nt9xr"
Mar 11 09:21:48 crc kubenswrapper[4830]: I0311 09:21:48.980005 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qn82n" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="registry-server" probeResult="failure" output=<
Mar 11 09:21:48 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s
Mar 11 09:21:48 crc kubenswrapper[4830]: >
Mar 11 09:21:57 crc kubenswrapper[4830]: I0311 09:21:57.997484 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qn82n"
Mar 11 09:21:58 crc kubenswrapper[4830]: I0311 09:21:58.071130 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qn82n"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.133212 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553682-rnxn9"]
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.134389 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553682-rnxn9"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.136000 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.141576 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.141763 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.142795 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553682-rnxn9"]
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.181714 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz5ss\" (UniqueName: \"kubernetes.io/projected/a5b49631-e712-4c93-afc6-59aff1ef208c-kube-api-access-wz5ss\") pod \"auto-csr-approver-29553682-rnxn9\" (UID: \"a5b49631-e712-4c93-afc6-59aff1ef208c\") " pod="openshift-infra/auto-csr-approver-29553682-rnxn9"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.282669 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz5ss\" (UniqueName: \"kubernetes.io/projected/a5b49631-e712-4c93-afc6-59aff1ef208c-kube-api-access-wz5ss\") pod \"auto-csr-approver-29553682-rnxn9\" (UID: \"a5b49631-e712-4c93-afc6-59aff1ef208c\") " pod="openshift-infra/auto-csr-approver-29553682-rnxn9"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.312886 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz5ss\" (UniqueName: \"kubernetes.io/projected/a5b49631-e712-4c93-afc6-59aff1ef208c-kube-api-access-wz5ss\") pod \"auto-csr-approver-29553682-rnxn9\" (UID: \"a5b49631-e712-4c93-afc6-59aff1ef208c\") " pod="openshift-infra/auto-csr-approver-29553682-rnxn9"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.459790 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553682-rnxn9"
Mar 11 09:22:00 crc kubenswrapper[4830]: I0311 09:22:00.927971 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553682-rnxn9"]
Mar 11 09:22:00 crc kubenswrapper[4830]: W0311 09:22:00.938140 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5b49631_e712_4c93_afc6_59aff1ef208c.slice/crio-06fea1cbeb488fe1093adce9837eeed6cd2c246da91e17a0836fd5dc933171ae WatchSource:0}: Error finding container 06fea1cbeb488fe1093adce9837eeed6cd2c246da91e17a0836fd5dc933171ae: Status 404 returned error can't find the container with id 06fea1cbeb488fe1093adce9837eeed6cd2c246da91e17a0836fd5dc933171ae
Mar 11 09:22:01 crc kubenswrapper[4830]: I0311 09:22:01.479399 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553682-rnxn9" event={"ID":"a5b49631-e712-4c93-afc6-59aff1ef208c","Type":"ContainerStarted","Data":"06fea1cbeb488fe1093adce9837eeed6cd2c246da91e17a0836fd5dc933171ae"}
Mar 11 09:22:03 crc kubenswrapper[4830]: I0311 09:22:03.500629 4830 generic.go:334] "Generic (PLEG): container finished" podID="a5b49631-e712-4c93-afc6-59aff1ef208c" containerID="35010b390229211ea793b8622ff91560faff48c170c0b0052342b1f8ee3ad633" exitCode=0
Mar 11 09:22:03 crc kubenswrapper[4830]: I0311 09:22:03.500792 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553682-rnxn9" event={"ID":"a5b49631-e712-4c93-afc6-59aff1ef208c","Type":"ContainerDied","Data":"35010b390229211ea793b8622ff91560faff48c170c0b0052342b1f8ee3ad633"}
Mar 11 09:22:05 crc kubenswrapper[4830]: I0311 09:22:05.512484 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553682-rnxn9" event={"ID":"a5b49631-e712-4c93-afc6-59aff1ef208c","Type":"ContainerDied","Data":"06fea1cbeb488fe1093adce9837eeed6cd2c246da91e17a0836fd5dc933171ae"}
Mar 11 09:22:05 crc kubenswrapper[4830]: I0311 09:22:05.513111 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06fea1cbeb488fe1093adce9837eeed6cd2c246da91e17a0836fd5dc933171ae"
Mar 11 09:22:05 crc kubenswrapper[4830]: I0311 09:22:05.574814 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553682-rnxn9"
Mar 11 09:22:05 crc kubenswrapper[4830]: I0311 09:22:05.642437 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz5ss\" (UniqueName: \"kubernetes.io/projected/a5b49631-e712-4c93-afc6-59aff1ef208c-kube-api-access-wz5ss\") pod \"a5b49631-e712-4c93-afc6-59aff1ef208c\" (UID: \"a5b49631-e712-4c93-afc6-59aff1ef208c\") "
Mar 11 09:22:05 crc kubenswrapper[4830]: I0311 09:22:05.648966 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b49631-e712-4c93-afc6-59aff1ef208c-kube-api-access-wz5ss" (OuterVolumeSpecName: "kube-api-access-wz5ss") pod "a5b49631-e712-4c93-afc6-59aff1ef208c" (UID: "a5b49631-e712-4c93-afc6-59aff1ef208c"). InnerVolumeSpecName "kube-api-access-wz5ss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:22:05 crc kubenswrapper[4830]: I0311 09:22:05.743908 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz5ss\" (UniqueName: \"kubernetes.io/projected/a5b49631-e712-4c93-afc6-59aff1ef208c-kube-api-access-wz5ss\") on node \"crc\" DevicePath \"\""
Mar 11 09:22:06 crc kubenswrapper[4830]: I0311 09:22:06.518108 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553682-rnxn9"
Mar 11 09:22:06 crc kubenswrapper[4830]: I0311 09:22:06.642289 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553676-chghx"]
Mar 11 09:22:06 crc kubenswrapper[4830]: I0311 09:22:06.645393 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553676-chghx"]
Mar 11 09:22:06 crc kubenswrapper[4830]: I0311 09:22:06.941211 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680233cf-fda8-402e-95a6-a596a0edd470" path="/var/lib/kubelet/pods/680233cf-fda8-402e-95a6-a596a0edd470/volumes"
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.060333 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.061122 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.061203 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8"
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.062117 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63a6a7ac35cab5a7e04856357852e727639b515067afed5509adcf1a3d3bc6f0"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.062224 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://63a6a7ac35cab5a7e04856357852e727639b515067afed5509adcf1a3d3bc6f0" gracePeriod=600
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.564470 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="63a6a7ac35cab5a7e04856357852e727639b515067afed5509adcf1a3d3bc6f0" exitCode=0
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.564538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"63a6a7ac35cab5a7e04856357852e727639b515067afed5509adcf1a3d3bc6f0"}
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.564739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"142e36b46036713ff5cf010b0bd983e98265d8b1e5bd25a219605acc4cae5ae2"}
Mar 11 09:22:13 crc kubenswrapper[4830]: I0311 09:22:13.564767 4830 scope.go:117] "RemoveContainer" containerID="a6809543f23d90ca1f7c68031d8d13e2ea98c26b5e48957e15fadac93873a241"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.140814 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553684-clgcq"]
Mar 11 09:24:00 crc kubenswrapper[4830]: E0311 09:24:00.141705 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b49631-e712-4c93-afc6-59aff1ef208c" containerName="oc"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.141721 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b49631-e712-4c93-afc6-59aff1ef208c" containerName="oc"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.141837 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b49631-e712-4c93-afc6-59aff1ef208c" containerName="oc"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.142344 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553684-clgcq"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.145188 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.148336 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.149467 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.162362 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553684-clgcq"]
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.193477 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h784d\" (UniqueName: \"kubernetes.io/projected/23a16aaf-f956-485d-ba92-0bc09cd6af26-kube-api-access-h784d\") pod \"auto-csr-approver-29553684-clgcq\" (UID: \"23a16aaf-f956-485d-ba92-0bc09cd6af26\") " pod="openshift-infra/auto-csr-approver-29553684-clgcq"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.294596 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h784d\" (UniqueName: \"kubernetes.io/projected/23a16aaf-f956-485d-ba92-0bc09cd6af26-kube-api-access-h784d\") pod \"auto-csr-approver-29553684-clgcq\" (UID: \"23a16aaf-f956-485d-ba92-0bc09cd6af26\") " pod="openshift-infra/auto-csr-approver-29553684-clgcq"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.325945 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h784d\" (UniqueName: \"kubernetes.io/projected/23a16aaf-f956-485d-ba92-0bc09cd6af26-kube-api-access-h784d\") pod \"auto-csr-approver-29553684-clgcq\" (UID: \"23a16aaf-f956-485d-ba92-0bc09cd6af26\") " pod="openshift-infra/auto-csr-approver-29553684-clgcq"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.496745 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553684-clgcq"
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.728937 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553684-clgcq"]
Mar 11 09:24:00 crc kubenswrapper[4830]: I0311 09:24:00.741260 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 09:24:01 crc kubenswrapper[4830]: I0311 09:24:01.283254 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553684-clgcq" event={"ID":"23a16aaf-f956-485d-ba92-0bc09cd6af26","Type":"ContainerStarted","Data":"bb7b71368ecc0825b6ae08bb1983c36a84da539025fdb5916bdbe90db8bb3d66"}
Mar 11 09:24:02 crc kubenswrapper[4830]: I0311 09:24:02.291195 4830 generic.go:334] "Generic (PLEG): container finished" podID="23a16aaf-f956-485d-ba92-0bc09cd6af26" containerID="07b485ff0fb5b27baeb096561a4f48acad4bb2ead35af12ab7fca4419525a027" exitCode=0
Mar 11 09:24:02 crc kubenswrapper[4830]: I0311 09:24:02.291477 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553684-clgcq" event={"ID":"23a16aaf-f956-485d-ba92-0bc09cd6af26","Type":"ContainerDied","Data":"07b485ff0fb5b27baeb096561a4f48acad4bb2ead35af12ab7fca4419525a027"}
Mar 11 09:24:03 crc kubenswrapper[4830]: I0311 09:24:03.527213 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553684-clgcq"
Mar 11 09:24:03 crc kubenswrapper[4830]: I0311 09:24:03.537061 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h784d\" (UniqueName: \"kubernetes.io/projected/23a16aaf-f956-485d-ba92-0bc09cd6af26-kube-api-access-h784d\") pod \"23a16aaf-f956-485d-ba92-0bc09cd6af26\" (UID: \"23a16aaf-f956-485d-ba92-0bc09cd6af26\") "
Mar 11 09:24:03 crc kubenswrapper[4830]: I0311 09:24:03.545804 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a16aaf-f956-485d-ba92-0bc09cd6af26-kube-api-access-h784d" (OuterVolumeSpecName: "kube-api-access-h784d") pod "23a16aaf-f956-485d-ba92-0bc09cd6af26" (UID: "23a16aaf-f956-485d-ba92-0bc09cd6af26"). InnerVolumeSpecName "kube-api-access-h784d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:24:03 crc kubenswrapper[4830]: I0311 09:24:03.638211 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h784d\" (UniqueName: \"kubernetes.io/projected/23a16aaf-f956-485d-ba92-0bc09cd6af26-kube-api-access-h784d\") on node \"crc\" DevicePath \"\""
Mar 11 09:24:04 crc kubenswrapper[4830]: I0311 09:24:04.310447 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553684-clgcq" event={"ID":"23a16aaf-f956-485d-ba92-0bc09cd6af26","Type":"ContainerDied","Data":"bb7b71368ecc0825b6ae08bb1983c36a84da539025fdb5916bdbe90db8bb3d66"}
Mar 11 09:24:04 crc kubenswrapper[4830]: I0311 09:24:04.310492 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb7b71368ecc0825b6ae08bb1983c36a84da539025fdb5916bdbe90db8bb3d66"
Mar 11 09:24:04 crc kubenswrapper[4830]: I0311 09:24:04.310569 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553684-clgcq"
Mar 11 09:24:04 crc kubenswrapper[4830]: I0311 09:24:04.584438 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553678-nslvc"]
Mar 11 09:24:04 crc kubenswrapper[4830]: I0311 09:24:04.590805 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553678-nslvc"]
Mar 11 09:24:04 crc kubenswrapper[4830]: I0311 09:24:04.945093 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e577451d-6016-4afc-913a-6d022a9a2f79" path="/var/lib/kubelet/pods/e577451d-6016-4afc-913a-6d022a9a2f79/volumes"
Mar 11 09:24:13 crc kubenswrapper[4830]: I0311 09:24:13.061215 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 09:24:13 crc kubenswrapper[4830]: I0311 09:24:13.061839 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 09:24:43 crc kubenswrapper[4830]: I0311 09:24:43.060894 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 09:24:43 crc kubenswrapper[4830]: I0311 09:24:43.061786 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 09:24:53 crc kubenswrapper[4830]: I0311 09:24:53.518795 4830 scope.go:117] "RemoveContainer" containerID="8e2e106054c1b0bd60e49b1e7ace0ec94301e773ddcc72e373e291a92e695311"
Mar 11 09:24:53 crc kubenswrapper[4830]: I0311 09:24:53.560082 4830 scope.go:117] "RemoveContainer" containerID="86dfda9e99c2b7e208ead4e5c5eca235f1e785d63b7a9507bc27070f8ee8c62f"
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.060762 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.061362 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.061410 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8"
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.062040 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"142e36b46036713ff5cf010b0bd983e98265d8b1e5bd25a219605acc4cae5ae2"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.062130 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://142e36b46036713ff5cf010b0bd983e98265d8b1e5bd25a219605acc4cae5ae2" gracePeriod=600
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.755477 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="142e36b46036713ff5cf010b0bd983e98265d8b1e5bd25a219605acc4cae5ae2" exitCode=0
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.755547 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"142e36b46036713ff5cf010b0bd983e98265d8b1e5bd25a219605acc4cae5ae2"}
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.755968 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"9f1549afce8227de9820039f9dd4bcf657fcc7950e158e1064942fb283e47f6d"}
Mar 11 09:25:13 crc kubenswrapper[4830]: I0311 09:25:13.755996 4830 scope.go:117] "RemoveContainer" containerID="63a6a7ac35cab5a7e04856357852e727639b515067afed5509adcf1a3d3bc6f0"
Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.135129 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553686-xg792"]
Mar 11 09:26:00 crc kubenswrapper[4830]: E0311 09:26:00.135968 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a16aaf-f956-485d-ba92-0bc09cd6af26" containerName="oc"
Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.135984 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a16aaf-f956-485d-ba92-0bc09cd6af26" containerName="oc"
Mar 
11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.136122 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a16aaf-f956-485d-ba92-0bc09cd6af26" containerName="oc" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.136497 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553686-xg792"] Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.136586 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553686-xg792" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.138700 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.139320 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.139477 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.298854 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktpz2\" (UniqueName: \"kubernetes.io/projected/e1c18180-afd3-4329-bc8c-8bf32ab5e82c-kube-api-access-ktpz2\") pod \"auto-csr-approver-29553686-xg792\" (UID: \"e1c18180-afd3-4329-bc8c-8bf32ab5e82c\") " pod="openshift-infra/auto-csr-approver-29553686-xg792" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.401043 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktpz2\" (UniqueName: \"kubernetes.io/projected/e1c18180-afd3-4329-bc8c-8bf32ab5e82c-kube-api-access-ktpz2\") pod \"auto-csr-approver-29553686-xg792\" (UID: \"e1c18180-afd3-4329-bc8c-8bf32ab5e82c\") " pod="openshift-infra/auto-csr-approver-29553686-xg792" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.422226 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktpz2\" (UniqueName: \"kubernetes.io/projected/e1c18180-afd3-4329-bc8c-8bf32ab5e82c-kube-api-access-ktpz2\") pod \"auto-csr-approver-29553686-xg792\" (UID: \"e1c18180-afd3-4329-bc8c-8bf32ab5e82c\") " pod="openshift-infra/auto-csr-approver-29553686-xg792" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.452915 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553686-xg792" Mar 11 09:26:00 crc kubenswrapper[4830]: I0311 09:26:00.640944 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553686-xg792"] Mar 11 09:26:01 crc kubenswrapper[4830]: I0311 09:26:01.053435 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553686-xg792" event={"ID":"e1c18180-afd3-4329-bc8c-8bf32ab5e82c","Type":"ContainerStarted","Data":"bdd7ce106886cb6701fa90231bdaf14c54927b29e544b65d7c9b3b2dd0aa7fe6"} Mar 11 09:26:02 crc kubenswrapper[4830]: I0311 09:26:02.059603 4830 generic.go:334] "Generic (PLEG): container finished" podID="e1c18180-afd3-4329-bc8c-8bf32ab5e82c" containerID="dcbf17aeb6cc373d6e5b4d9db7d9c4370421143343b6e60a03722cd25a9c1dbd" exitCode=0 Mar 11 09:26:02 crc kubenswrapper[4830]: I0311 09:26:02.059785 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553686-xg792" event={"ID":"e1c18180-afd3-4329-bc8c-8bf32ab5e82c","Type":"ContainerDied","Data":"dcbf17aeb6cc373d6e5b4d9db7d9c4370421143343b6e60a03722cd25a9c1dbd"} Mar 11 09:26:03 crc kubenswrapper[4830]: I0311 09:26:03.365623 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553686-xg792" Mar 11 09:26:03 crc kubenswrapper[4830]: I0311 09:26:03.543780 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktpz2\" (UniqueName: \"kubernetes.io/projected/e1c18180-afd3-4329-bc8c-8bf32ab5e82c-kube-api-access-ktpz2\") pod \"e1c18180-afd3-4329-bc8c-8bf32ab5e82c\" (UID: \"e1c18180-afd3-4329-bc8c-8bf32ab5e82c\") " Mar 11 09:26:03 crc kubenswrapper[4830]: I0311 09:26:03.552347 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c18180-afd3-4329-bc8c-8bf32ab5e82c-kube-api-access-ktpz2" (OuterVolumeSpecName: "kube-api-access-ktpz2") pod "e1c18180-afd3-4329-bc8c-8bf32ab5e82c" (UID: "e1c18180-afd3-4329-bc8c-8bf32ab5e82c"). InnerVolumeSpecName "kube-api-access-ktpz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:26:03 crc kubenswrapper[4830]: I0311 09:26:03.645415 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktpz2\" (UniqueName: \"kubernetes.io/projected/e1c18180-afd3-4329-bc8c-8bf32ab5e82c-kube-api-access-ktpz2\") on node \"crc\" DevicePath \"\"" Mar 11 09:26:04 crc kubenswrapper[4830]: I0311 09:26:04.075630 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553686-xg792" event={"ID":"e1c18180-afd3-4329-bc8c-8bf32ab5e82c","Type":"ContainerDied","Data":"bdd7ce106886cb6701fa90231bdaf14c54927b29e544b65d7c9b3b2dd0aa7fe6"} Mar 11 09:26:04 crc kubenswrapper[4830]: I0311 09:26:04.075693 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdd7ce106886cb6701fa90231bdaf14c54927b29e544b65d7c9b3b2dd0aa7fe6" Mar 11 09:26:04 crc kubenswrapper[4830]: I0311 09:26:04.075715 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553686-xg792" Mar 11 09:26:04 crc kubenswrapper[4830]: I0311 09:26:04.429497 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553680-knd42"] Mar 11 09:26:04 crc kubenswrapper[4830]: I0311 09:26:04.434981 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553680-knd42"] Mar 11 09:26:04 crc kubenswrapper[4830]: I0311 09:26:04.944575 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd5fce1-cc80-4c14-92fc-9c220e5ffa93" path="/var/lib/kubelet/pods/5bd5fce1-cc80-4c14-92fc-9c220e5ffa93/volumes" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.329611 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv"] Mar 11 09:26:48 crc kubenswrapper[4830]: E0311 09:26:48.330430 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c18180-afd3-4329-bc8c-8bf32ab5e82c" containerName="oc" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.330447 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c18180-afd3-4329-bc8c-8bf32ab5e82c" containerName="oc" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.330570 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c18180-afd3-4329-bc8c-8bf32ab5e82c" containerName="oc" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.331011 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.337402 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.337660 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.338580 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8rr2b" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.342617 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv"] Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.348927 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-s7blt"] Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.350200 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-s7blt" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.355001 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-nfwwd" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.364175 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-s7blt"] Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.375820 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xlw9r"] Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.376867 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.381307 4830 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-j4cbb" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.387640 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xlw9r"] Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.471577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brrgs\" (UniqueName: \"kubernetes.io/projected/1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d-kube-api-access-brrgs\") pod \"cert-manager-858654f9db-s7blt\" (UID: \"1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d\") " pod="cert-manager/cert-manager-858654f9db-s7blt" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.471637 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9dv\" (UniqueName: \"kubernetes.io/projected/c952f67d-03e4-4c30-a44b-884f26d81c4e-kube-api-access-9n9dv\") pod \"cert-manager-cainjector-cf98fcc89-9vzgv\" (UID: \"c952f67d-03e4-4c30-a44b-884f26d81c4e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.572586 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fvzq\" (UniqueName: \"kubernetes.io/projected/8278ba6d-7719-4b12-9f80-29867e6fc2ba-kube-api-access-6fvzq\") pod \"cert-manager-webhook-687f57d79b-xlw9r\" (UID: \"8278ba6d-7719-4b12-9f80-29867e6fc2ba\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.572628 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9dv\" (UniqueName: 
\"kubernetes.io/projected/c952f67d-03e4-4c30-a44b-884f26d81c4e-kube-api-access-9n9dv\") pod \"cert-manager-cainjector-cf98fcc89-9vzgv\" (UID: \"c952f67d-03e4-4c30-a44b-884f26d81c4e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.572693 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brrgs\" (UniqueName: \"kubernetes.io/projected/1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d-kube-api-access-brrgs\") pod \"cert-manager-858654f9db-s7blt\" (UID: \"1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d\") " pod="cert-manager/cert-manager-858654f9db-s7blt" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.590825 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brrgs\" (UniqueName: \"kubernetes.io/projected/1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d-kube-api-access-brrgs\") pod \"cert-manager-858654f9db-s7blt\" (UID: \"1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d\") " pod="cert-manager/cert-manager-858654f9db-s7blt" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.590881 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9dv\" (UniqueName: \"kubernetes.io/projected/c952f67d-03e4-4c30-a44b-884f26d81c4e-kube-api-access-9n9dv\") pod \"cert-manager-cainjector-cf98fcc89-9vzgv\" (UID: \"c952f67d-03e4-4c30-a44b-884f26d81c4e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.657567 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.671622 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-s7blt" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.674251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fvzq\" (UniqueName: \"kubernetes.io/projected/8278ba6d-7719-4b12-9f80-29867e6fc2ba-kube-api-access-6fvzq\") pod \"cert-manager-webhook-687f57d79b-xlw9r\" (UID: \"8278ba6d-7719-4b12-9f80-29867e6fc2ba\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.693905 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fvzq\" (UniqueName: \"kubernetes.io/projected/8278ba6d-7719-4b12-9f80-29867e6fc2ba-kube-api-access-6fvzq\") pod \"cert-manager-webhook-687f57d79b-xlw9r\" (UID: \"8278ba6d-7719-4b12-9f80-29867e6fc2ba\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.882236 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-s7blt"] Mar 11 09:26:48 crc kubenswrapper[4830]: W0311 09:26:48.887485 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ef3b4a5_606d_4ed7_ba6f_be2095e5d84d.slice/crio-eccb5bdfba8b9e1b5972bffc50f7a1e241cbd00d8552d1e47e6723f6c27d2517 WatchSource:0}: Error finding container eccb5bdfba8b9e1b5972bffc50f7a1e241cbd00d8552d1e47e6723f6c27d2517: Status 404 returned error can't find the container with id eccb5bdfba8b9e1b5972bffc50f7a1e241cbd00d8552d1e47e6723f6c27d2517 Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.918741 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv"] Mar 11 09:26:48 crc kubenswrapper[4830]: W0311 09:26:48.921607 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc952f67d_03e4_4c30_a44b_884f26d81c4e.slice/crio-9eadde6f81bd2e54221d532aa43d1ec31b3caef4f772bffc2e83b43d378dcfec WatchSource:0}: Error finding container 9eadde6f81bd2e54221d532aa43d1ec31b3caef4f772bffc2e83b43d378dcfec: Status 404 returned error can't find the container with id 9eadde6f81bd2e54221d532aa43d1ec31b3caef4f772bffc2e83b43d378dcfec Mar 11 09:26:48 crc kubenswrapper[4830]: I0311 09:26:48.991768 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" Mar 11 09:26:49 crc kubenswrapper[4830]: I0311 09:26:49.167351 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xlw9r"] Mar 11 09:26:49 crc kubenswrapper[4830]: W0311 09:26:49.175024 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8278ba6d_7719_4b12_9f80_29867e6fc2ba.slice/crio-f470fb292f6c12e7c579520dc326fd46368e1fe06f86617c180fca2820372758 WatchSource:0}: Error finding container f470fb292f6c12e7c579520dc326fd46368e1fe06f86617c180fca2820372758: Status 404 returned error can't find the container with id f470fb292f6c12e7c579520dc326fd46368e1fe06f86617c180fca2820372758 Mar 11 09:26:49 crc kubenswrapper[4830]: I0311 09:26:49.368779 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-s7blt" event={"ID":"1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d","Type":"ContainerStarted","Data":"eccb5bdfba8b9e1b5972bffc50f7a1e241cbd00d8552d1e47e6723f6c27d2517"} Mar 11 09:26:49 crc kubenswrapper[4830]: I0311 09:26:49.370775 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" event={"ID":"8278ba6d-7719-4b12-9f80-29867e6fc2ba","Type":"ContainerStarted","Data":"f470fb292f6c12e7c579520dc326fd46368e1fe06f86617c180fca2820372758"} Mar 11 09:26:49 crc 
kubenswrapper[4830]: I0311 09:26:49.371858 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv" event={"ID":"c952f67d-03e4-4c30-a44b-884f26d81c4e","Type":"ContainerStarted","Data":"9eadde6f81bd2e54221d532aa43d1ec31b3caef4f772bffc2e83b43d378dcfec"} Mar 11 09:26:53 crc kubenswrapper[4830]: I0311 09:26:53.643831 4830 scope.go:117] "RemoveContainer" containerID="9d0b0b705c635201e241b6ade6a3bd1f15fd636e2dd493fa96ff88d6f43eb60d" Mar 11 09:26:54 crc kubenswrapper[4830]: I0311 09:26:54.398112 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-s7blt" event={"ID":"1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d","Type":"ContainerStarted","Data":"d1b17749eb086c88ece6c8cf02b25836dae3951d0b80716283abd48cc740136d"} Mar 11 09:26:54 crc kubenswrapper[4830]: I0311 09:26:54.399827 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" event={"ID":"8278ba6d-7719-4b12-9f80-29867e6fc2ba","Type":"ContainerStarted","Data":"df6894de2c757370966575f8055919bd6d0608a4e4807316b2086efee816f0d1"} Mar 11 09:26:54 crc kubenswrapper[4830]: I0311 09:26:54.400052 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" Mar 11 09:26:54 crc kubenswrapper[4830]: I0311 09:26:54.401183 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv" event={"ID":"c952f67d-03e4-4c30-a44b-884f26d81c4e","Type":"ContainerStarted","Data":"b66ab8a07a742443c1ba1e0c770b99fffcd25aec8913b4826eca64382810445e"} Mar 11 09:26:54 crc kubenswrapper[4830]: I0311 09:26:54.416959 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-s7blt" podStartSLOduration=1.252306633 podStartE2EDuration="6.416936264s" podCreationTimestamp="2026-03-11 09:26:48 +0000 UTC" firstStartedPulling="2026-03-11 
09:26:48.889496658 +0000 UTC m=+776.670647347" lastFinishedPulling="2026-03-11 09:26:54.054126289 +0000 UTC m=+781.835276978" observedRunningTime="2026-03-11 09:26:54.412824088 +0000 UTC m=+782.193974797" watchObservedRunningTime="2026-03-11 09:26:54.416936264 +0000 UTC m=+782.198086953" Mar 11 09:26:54 crc kubenswrapper[4830]: I0311 09:26:54.461919 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-9vzgv" podStartSLOduration=1.3310499949999999 podStartE2EDuration="6.461900683s" podCreationTimestamp="2026-03-11 09:26:48 +0000 UTC" firstStartedPulling="2026-03-11 09:26:48.923277451 +0000 UTC m=+776.704428140" lastFinishedPulling="2026-03-11 09:26:54.054128139 +0000 UTC m=+781.835278828" observedRunningTime="2026-03-11 09:26:54.443080841 +0000 UTC m=+782.224231540" watchObservedRunningTime="2026-03-11 09:26:54.461900683 +0000 UTC m=+782.243051372" Mar 11 09:26:54 crc kubenswrapper[4830]: I0311 09:26:54.462613 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" podStartSLOduration=1.530611034 podStartE2EDuration="6.462606352s" podCreationTimestamp="2026-03-11 09:26:48 +0000 UTC" firstStartedPulling="2026-03-11 09:26:49.177482352 +0000 UTC m=+776.958633041" lastFinishedPulling="2026-03-11 09:26:54.10947767 +0000 UTC m=+781.890628359" observedRunningTime="2026-03-11 09:26:54.45968371 +0000 UTC m=+782.240834399" watchObservedRunningTime="2026-03-11 09:26:54.462606352 +0000 UTC m=+782.243757041" Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.689852 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gtl5j"] Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.690819 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovn-controller" 
containerID="cri-o://f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834" gracePeriod=30 Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.691236 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="sbdb" containerID="cri-o://628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9" gracePeriod=30 Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.691282 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="nbdb" containerID="cri-o://cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129" gracePeriod=30 Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.691325 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="northd" containerID="cri-o://709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35" gracePeriod=30 Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.691359 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b" gracePeriod=30 Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.691394 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kube-rbac-proxy-node" containerID="cri-o://aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e" gracePeriod=30 Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.691426 4830 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovn-acl-logging" containerID="cri-o://c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64" gracePeriod=30 Mar 11 09:27:02 crc kubenswrapper[4830]: I0311 09:27:02.721695 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" containerID="cri-o://d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" gracePeriod=30 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.050091 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/3.log" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.052776 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovn-acl-logging/0.log" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.053443 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovn-controller/0.log" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.053918 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.118579 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6c46p"] Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.118834 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovn-acl-logging" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.118854 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovn-acl-logging" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.118867 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="northd" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.118875 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="northd" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.118894 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kubecfg-setup" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.118907 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kubecfg-setup" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.118917 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.118926 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.118940 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" 
containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.118948 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.118958 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kube-rbac-proxy-node" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.118966 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kube-rbac-proxy-node" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.118976 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovn-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.118986 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovn-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.118999 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119008 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.119069 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="nbdb" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119078 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="nbdb" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.119092 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="sbdb" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119101 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="sbdb" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.119109 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119116 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119230 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="nbdb" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119240 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="sbdb" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119250 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119260 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kube-rbac-proxy-ovn-metrics" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119270 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="kube-rbac-proxy-node" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119283 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119292 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovn-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119304 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovn-acl-logging" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119316 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119325 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119335 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="northd" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.119444 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119453 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119576 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.119713 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.119722 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerName="ovnkube-controller" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.121616 4830 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137348 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqfbx\" (UniqueName: \"kubernetes.io/projected/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-kube-api-access-wqfbx\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137412 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-systemd\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137458 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-ovn-kubernetes\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137495 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-var-lib-openvswitch\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137527 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-kubelet\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137557 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-openvswitch\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137673 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137724 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137600 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137756 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). 
InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137785 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137815 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-node-log\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137859 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-netd\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137891 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-config\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137904 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-node-log" (OuterVolumeSpecName: "node-log") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137951 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.137974 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-ovn\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138007 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-slash\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138047 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-systemd-units\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138068 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-bin\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138094 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-script-lib\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138095 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138115 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-env-overrides\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138138 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovn-node-metrics-cert\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138147 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138161 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-etc-openvswitch\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138182 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-netns\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138188 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-slash" (OuterVolumeSpecName: "host-slash") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138198 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-log-socket\") pod \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\" (UID: \"13b9ac6c-3f4b-4dd4-b91d-7173880939d8\") " Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138342 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138370 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138573 4830 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138589 4830 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-slash\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138601 4830 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138615 4830 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138626 4830 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138637 4830 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138648 4830 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138660 4830 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138671 4830 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-node-log\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138685 4830 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138696 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138721 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138747 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138756 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.138777 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-log-socket" (OuterVolumeSpecName: "log-socket") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.139277 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.139392 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.145340 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.145946 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-kube-api-access-wqfbx" (OuterVolumeSpecName: "kube-api-access-wqfbx") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "kube-api-access-wqfbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.151886 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "13b9ac6c-3f4b-4dd4-b91d-7173880939d8" (UID: "13b9ac6c-3f4b-4dd4-b91d-7173880939d8"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.240796 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb9bz\" (UniqueName: \"kubernetes.io/projected/184939bf-cea8-490d-868e-3d43fe7b24be-kube-api-access-gb9bz\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.240888 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/184939bf-cea8-490d-868e-3d43fe7b24be-ovn-node-metrics-cert\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.240965 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-systemd-units\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241066 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-openvswitch\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241152 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-systemd\") pod \"ovnkube-node-6c46p\" (UID: 
\"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-env-overrides\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241226 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-slash\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241352 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-etc-openvswitch\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241503 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-ovnkube-script-lib\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241605 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-run-ovn-kubernetes\") 
pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241670 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-run-netns\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241746 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-ovnkube-config\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241822 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-log-socket\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241886 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241930 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-ovn\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241951 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-cni-netd\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.241994 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-kubelet\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242057 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-cni-bin\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242096 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-node-log\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242165 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-var-lib-openvswitch\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242275 4830 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242304 4830 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242324 4830 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242340 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242358 4830 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242375 4830 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-log-socket\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242393 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqfbx\" (UniqueName: 
\"kubernetes.io/projected/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-kube-api-access-wqfbx\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242411 4830 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.242428 4830 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13b9ac6c-3f4b-4dd4-b91d-7173880939d8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.343863 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-ovnkube-config\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344231 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-log-socket\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344259 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344286 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-ovn\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344308 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-cni-netd\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344337 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-kubelet\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344355 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-cni-bin\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344355 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-log-socket\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344383 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-ovn\") pod 
\"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344376 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-node-log\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344419 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-cni-bin\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344443 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-node-log\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344449 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-kubelet\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344482 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-cni-netd\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344488 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-var-lib-openvswitch\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344524 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-var-lib-openvswitch\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344595 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb9bz\" (UniqueName: \"kubernetes.io/projected/184939bf-cea8-490d-868e-3d43fe7b24be-kube-api-access-gb9bz\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344641 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/184939bf-cea8-490d-868e-3d43fe7b24be-ovn-node-metrics-cert\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" 
Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344702 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-systemd-units\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344732 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-openvswitch\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344795 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-env-overrides\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344817 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-ovnkube-config\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344856 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-systemd\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344824 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-systemd\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344894 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-systemd-units\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344923 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-slash\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.344982 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-etc-openvswitch\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.345088 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-ovnkube-script-lib\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.345152 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-run-ovn-kubernetes\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.345211 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-run-netns\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.345315 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-run-netns\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.345365 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-run-openvswitch\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.346148 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-env-overrides\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.346180 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-etc-openvswitch\") pod 
\"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.346265 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-slash\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.346387 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/184939bf-cea8-490d-868e-3d43fe7b24be-host-run-ovn-kubernetes\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.347063 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/184939bf-cea8-490d-868e-3d43fe7b24be-ovnkube-script-lib\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.351493 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/184939bf-cea8-490d-868e-3d43fe7b24be-ovn-node-metrics-cert\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.372455 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb9bz\" (UniqueName: \"kubernetes.io/projected/184939bf-cea8-490d-868e-3d43fe7b24be-kube-api-access-gb9bz\") pod \"ovnkube-node-6c46p\" (UID: \"184939bf-cea8-490d-868e-3d43fe7b24be\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.438138 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.462542 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovnkube-controller/3.log" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.466035 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovn-acl-logging/0.log" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.466744 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gtl5j_13b9ac6c-3f4b-4dd4-b91d-7173880939d8/ovn-controller/0.log" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467720 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467742 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467796 4830 scope.go:117] "RemoveContainer" containerID="d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467875 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" exitCode=0 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467901 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9" exitCode=0 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467912 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129" exitCode=0 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467924 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35" exitCode=0 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467932 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b" exitCode=0 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467941 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e" exitCode=0 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467950 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64" exitCode=143 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.467958 4830 generic.go:334] "Generic (PLEG): container finished" podID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" containerID="f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834" exitCode=143 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468029 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468050 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468066 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468077 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468091 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468103 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468116 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468123 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468130 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468138 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468144 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468151 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} Mar 11 
09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468157 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468164 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468173 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468214 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.468223 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.471749 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.472070 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.472093 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.472107 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.473797 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.473854 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.473869 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.473883 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474617 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474686 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474703 4830 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474720 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474735 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474751 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474838 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474854 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474865 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474877 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474901 4830 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474921 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gtl5j" event={"ID":"13b9ac6c-3f4b-4dd4-b91d-7173880939d8","Type":"ContainerDied","Data":"e56274693a8f57aeb7ae06293937d218211f4df2f998847c632307b0678b6389"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474944 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474958 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474969 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474982 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.474994 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.475005 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} Mar 11 
09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.475045 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.475057 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.475068 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.475079 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.484928 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/2.log" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.485879 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/1.log" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.485975 4830 generic.go:334] "Generic (PLEG): container finished" podID="75fdb109-77cf-4d97-ac3c-6f3139b3bb7a" containerID="81a0883de2371cc2d74c6c6fe6667f4a6bc5042033ad0fe8911ac16bd14906ac" exitCode=2 Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.486090 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8w98l" event={"ID":"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a","Type":"ContainerDied","Data":"81a0883de2371cc2d74c6c6fe6667f4a6bc5042033ad0fe8911ac16bd14906ac"} Mar 11 09:27:03 crc 
kubenswrapper[4830]: I0311 09:27:03.486179 4830 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223"} Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.487006 4830 scope.go:117] "RemoveContainer" containerID="81a0883de2371cc2d74c6c6fe6667f4a6bc5042033ad0fe8911ac16bd14906ac" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.491740 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8w98l_openshift-multus(75fdb109-77cf-4d97-ac3c-6f3139b3bb7a)\"" pod="openshift-multus/multus-8w98l" podUID="75fdb109-77cf-4d97-ac3c-6f3139b3bb7a" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.514819 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.545897 4830 scope.go:117] "RemoveContainer" containerID="628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.556191 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gtl5j"] Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.558910 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gtl5j"] Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.564140 4830 scope.go:117] "RemoveContainer" containerID="cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.577975 4830 scope.go:117] "RemoveContainer" containerID="709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.594996 4830 scope.go:117] "RemoveContainer" 
containerID="55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.646200 4830 scope.go:117] "RemoveContainer" containerID="aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.658396 4830 scope.go:117] "RemoveContainer" containerID="c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.670269 4830 scope.go:117] "RemoveContainer" containerID="f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.687258 4830 scope.go:117] "RemoveContainer" containerID="6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.704303 4830 scope.go:117] "RemoveContainer" containerID="d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.704785 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": container with ID starting with d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b not found: ID does not exist" containerID="d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.704835 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} err="failed to get container status \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": rpc error: code = NotFound desc = could not find container \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": container with ID starting with 
d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.704870 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.705142 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": container with ID starting with 0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0 not found: ID does not exist" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.705164 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} err="failed to get container status \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": rpc error: code = NotFound desc = could not find container \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": container with ID starting with 0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.705180 4830 scope.go:117] "RemoveContainer" containerID="628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.705405 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": container with ID starting with 628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9 not found: ID does not exist" containerID="628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9" Mar 11 09:27:03 crc 
kubenswrapper[4830]: I0311 09:27:03.705423 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} err="failed to get container status \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": rpc error: code = NotFound desc = could not find container \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": container with ID starting with 628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.705439 4830 scope.go:117] "RemoveContainer" containerID="cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.705685 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": container with ID starting with cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129 not found: ID does not exist" containerID="cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.705707 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} err="failed to get container status \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": rpc error: code = NotFound desc = could not find container \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": container with ID starting with cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.705722 4830 scope.go:117] "RemoveContainer" containerID="709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35" Mar 11 
09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.706038 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": container with ID starting with 709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35 not found: ID does not exist" containerID="709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.706112 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} err="failed to get container status \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": rpc error: code = NotFound desc = could not find container \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": container with ID starting with 709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.706142 4830 scope.go:117] "RemoveContainer" containerID="55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.706437 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": container with ID starting with 55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b not found: ID does not exist" containerID="55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.706470 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} err="failed to get container status 
\"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": rpc error: code = NotFound desc = could not find container \"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": container with ID starting with 55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.706489 4830 scope.go:117] "RemoveContainer" containerID="aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.706854 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": container with ID starting with aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e not found: ID does not exist" containerID="aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.706902 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} err="failed to get container status \"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": rpc error: code = NotFound desc = could not find container \"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": container with ID starting with aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.706931 4830 scope.go:117] "RemoveContainer" containerID="c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.707282 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": container with ID starting with c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64 not found: ID does not exist" containerID="c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.707311 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} err="failed to get container status \"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": rpc error: code = NotFound desc = could not find container \"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": container with ID starting with c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.707330 4830 scope.go:117] "RemoveContainer" containerID="f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.707550 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": container with ID starting with f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834 not found: ID does not exist" containerID="f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.707573 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} err="failed to get container status \"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": rpc error: code = NotFound desc = could not find container \"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": container with ID 
starting with f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.707588 4830 scope.go:117] "RemoveContainer" containerID="6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0" Mar 11 09:27:03 crc kubenswrapper[4830]: E0311 09:27:03.707804 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": container with ID starting with 6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0 not found: ID does not exist" containerID="6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.707836 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} err="failed to get container status \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": rpc error: code = NotFound desc = could not find container \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": container with ID starting with 6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.707856 4830 scope.go:117] "RemoveContainer" containerID="d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.708155 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} err="failed to get container status \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": rpc error: code = NotFound desc = could not find container \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": 
container with ID starting with d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.708182 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.708427 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} err="failed to get container status \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": rpc error: code = NotFound desc = could not find container \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": container with ID starting with 0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.708448 4830 scope.go:117] "RemoveContainer" containerID="628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.708731 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} err="failed to get container status \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": rpc error: code = NotFound desc = could not find container \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": container with ID starting with 628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.708762 4830 scope.go:117] "RemoveContainer" containerID="cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.709096 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} err="failed to get container status \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": rpc error: code = NotFound desc = could not find container \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": container with ID starting with cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.709116 4830 scope.go:117] "RemoveContainer" containerID="709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.709372 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} err="failed to get container status \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": rpc error: code = NotFound desc = could not find container \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": container with ID starting with 709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.709389 4830 scope.go:117] "RemoveContainer" containerID="55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.709634 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} err="failed to get container status \"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": rpc error: code = NotFound desc = could not find container \"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": container with ID starting with 55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b not found: ID does not 
exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.709651 4830 scope.go:117] "RemoveContainer" containerID="aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.710104 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} err="failed to get container status \"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": rpc error: code = NotFound desc = could not find container \"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": container with ID starting with aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.710121 4830 scope.go:117] "RemoveContainer" containerID="c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.710337 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} err="failed to get container status \"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": rpc error: code = NotFound desc = could not find container \"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": container with ID starting with c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.710373 4830 scope.go:117] "RemoveContainer" containerID="f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.710686 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} err="failed to get container status 
\"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": rpc error: code = NotFound desc = could not find container \"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": container with ID starting with f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.710716 4830 scope.go:117] "RemoveContainer" containerID="6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.711002 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} err="failed to get container status \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": rpc error: code = NotFound desc = could not find container \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": container with ID starting with 6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.711063 4830 scope.go:117] "RemoveContainer" containerID="d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.711358 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} err="failed to get container status \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": rpc error: code = NotFound desc = could not find container \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": container with ID starting with d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.711384 4830 scope.go:117] "RemoveContainer" 
containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.711758 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} err="failed to get container status \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": rpc error: code = NotFound desc = could not find container \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": container with ID starting with 0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.711776 4830 scope.go:117] "RemoveContainer" containerID="628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.712067 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} err="failed to get container status \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": rpc error: code = NotFound desc = could not find container \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": container with ID starting with 628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.712099 4830 scope.go:117] "RemoveContainer" containerID="cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.712394 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} err="failed to get container status \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": rpc error: code = NotFound desc = could 
not find container \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": container with ID starting with cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.712413 4830 scope.go:117] "RemoveContainer" containerID="709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.712701 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} err="failed to get container status \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": rpc error: code = NotFound desc = could not find container \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": container with ID starting with 709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.712731 4830 scope.go:117] "RemoveContainer" containerID="55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.713041 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} err="failed to get container status \"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": rpc error: code = NotFound desc = could not find container \"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": container with ID starting with 55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.713069 4830 scope.go:117] "RemoveContainer" containerID="aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 
09:27:03.715054 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} err="failed to get container status \"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": rpc error: code = NotFound desc = could not find container \"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": container with ID starting with aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.715082 4830 scope.go:117] "RemoveContainer" containerID="c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.719211 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} err="failed to get container status \"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": rpc error: code = NotFound desc = could not find container \"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": container with ID starting with c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.719253 4830 scope.go:117] "RemoveContainer" containerID="f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.719625 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} err="failed to get container status \"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": rpc error: code = NotFound desc = could not find container \"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": container with ID starting with 
f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.719650 4830 scope.go:117] "RemoveContainer" containerID="6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.719882 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} err="failed to get container status \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": rpc error: code = NotFound desc = could not find container \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": container with ID starting with 6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.719898 4830 scope.go:117] "RemoveContainer" containerID="d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.720275 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} err="failed to get container status \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": rpc error: code = NotFound desc = could not find container \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": container with ID starting with d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.720412 4830 scope.go:117] "RemoveContainer" containerID="0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.720724 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0"} err="failed to get container status \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": rpc error: code = NotFound desc = could not find container \"0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0\": container with ID starting with 0da14bb05b953e465af2145681d45accb5a1212410612ab58cd887b9dfd08be0 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.720742 4830 scope.go:117] "RemoveContainer" containerID="628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.721071 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9"} err="failed to get container status \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": rpc error: code = NotFound desc = could not find container \"628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9\": container with ID starting with 628021a120089e25e5dad128049e80fce7731d4be27a7d585d207a2b763908c9 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.721135 4830 scope.go:117] "RemoveContainer" containerID="cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.721528 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129"} err="failed to get container status \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": rpc error: code = NotFound desc = could not find container \"cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129\": container with ID starting with cd3e1a866cf7f4f1e3c5f54a2cc18c482ef5784bebfb99d154d4d483a8dd8129 not found: ID does not 
exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.721560 4830 scope.go:117] "RemoveContainer" containerID="709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.721783 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35"} err="failed to get container status \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": rpc error: code = NotFound desc = could not find container \"709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35\": container with ID starting with 709c3e92b15e36fed2bb7fd836958f33bb564e7a2356c00b6cb0eb29cd97ed35 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.721807 4830 scope.go:117] "RemoveContainer" containerID="55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.722261 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b"} err="failed to get container status \"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": rpc error: code = NotFound desc = could not find container \"55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b\": container with ID starting with 55a615e2dd334759b2c5148745c162196519a61c63fca3f5be9ac9bf01deef6b not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.722291 4830 scope.go:117] "RemoveContainer" containerID="aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.722661 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e"} err="failed to get container status 
\"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": rpc error: code = NotFound desc = could not find container \"aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e\": container with ID starting with aacbce94a87389ba0298134c5a8d120746494f519ead94a562fb922f45c0715e not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.722755 4830 scope.go:117] "RemoveContainer" containerID="c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.723353 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64"} err="failed to get container status \"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": rpc error: code = NotFound desc = could not find container \"c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64\": container with ID starting with c3def28fa9f6ce2ad9d8435a8d2dd8f07e80fca9e6f5db9063c2212af9ec2f64 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.723382 4830 scope.go:117] "RemoveContainer" containerID="f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.723786 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834"} err="failed to get container status \"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": rpc error: code = NotFound desc = could not find container \"f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834\": container with ID starting with f30c532e10850b31b9b406114d83d9ef6ca77a72dfb1ceaa3d48d28cdeff9834 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.723842 4830 scope.go:117] "RemoveContainer" 
containerID="6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.724209 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0"} err="failed to get container status \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": rpc error: code = NotFound desc = could not find container \"6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0\": container with ID starting with 6824d7f30f42fd27c6e1d971927596177d92fa5db1172c56ff1d2956bd7e3cb0 not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.724231 4830 scope.go:117] "RemoveContainer" containerID="d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.724749 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b"} err="failed to get container status \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": rpc error: code = NotFound desc = could not find container \"d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b\": container with ID starting with d938962540ed59fa84759f741f40765ef4e7491af456be61a60073cd67ad002b not found: ID does not exist" Mar 11 09:27:03 crc kubenswrapper[4830]: I0311 09:27:03.995755 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-xlw9r" Mar 11 09:27:04 crc kubenswrapper[4830]: I0311 09:27:04.501791 4830 generic.go:334] "Generic (PLEG): container finished" podID="184939bf-cea8-490d-868e-3d43fe7b24be" containerID="98def682187d1142dbea74d2e670bd2d544b0e4660b25c1d8dbd6aabc9b0298d" exitCode=0 Mar 11 09:27:04 crc kubenswrapper[4830]: I0311 09:27:04.501838 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerDied","Data":"98def682187d1142dbea74d2e670bd2d544b0e4660b25c1d8dbd6aabc9b0298d"} Mar 11 09:27:04 crc kubenswrapper[4830]: I0311 09:27:04.501870 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"a59e5a3f9723feb71652745bc88f6835b5734ca98ab3ca3555d800492b6af845"} Mar 11 09:27:04 crc kubenswrapper[4830]: I0311 09:27:04.948821 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13b9ac6c-3f4b-4dd4-b91d-7173880939d8" path="/var/lib/kubelet/pods/13b9ac6c-3f4b-4dd4-b91d-7173880939d8/volumes" Mar 11 09:27:05 crc kubenswrapper[4830]: I0311 09:27:05.514173 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"6ec40f7d05f06bbead89e49c60f555822a47e1cac891f29e55874fe7910f9c3f"} Mar 11 09:27:05 crc kubenswrapper[4830]: I0311 09:27:05.514548 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"648bf01975630e624be9caab2b27ef85fb3002578c44a94cb2596c58abe39995"} Mar 11 09:27:05 crc kubenswrapper[4830]: I0311 09:27:05.514571 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"11fe21bbc92f0afe994bb3ef4493845ce383fe4512882f1c834ea2e89ee69ae2"} Mar 11 09:27:05 crc kubenswrapper[4830]: I0311 09:27:05.514592 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" 
event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"da7c89783dc2f376f3c16f052435f27df5c76057d65691d42ac589a412791caa"} Mar 11 09:27:05 crc kubenswrapper[4830]: I0311 09:27:05.514608 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"07137e3c17f5fc1b4b03eb13f82164fd64e57d346e9d5956674286b039e69f0c"} Mar 11 09:27:05 crc kubenswrapper[4830]: I0311 09:27:05.514625 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"29b091ac881b9d9196e572243964463f3243df85644228fdefb2b2fbe9c1b72a"} Mar 11 09:27:08 crc kubenswrapper[4830]: I0311 09:27:08.534596 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"618e20c1a184c0a9f8b767f9d5ab699da379319330674ff6be9202481f1cd695"} Mar 11 09:27:10 crc kubenswrapper[4830]: I0311 09:27:10.548243 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" event={"ID":"184939bf-cea8-490d-868e-3d43fe7b24be","Type":"ContainerStarted","Data":"e06d5de02d9edfe8517288d688227a141cc77b0e134106971fafefb533f0611a"} Mar 11 09:27:10 crc kubenswrapper[4830]: I0311 09:27:10.548613 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:10 crc kubenswrapper[4830]: I0311 09:27:10.548637 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:10 crc kubenswrapper[4830]: I0311 09:27:10.548652 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:10 crc 
kubenswrapper[4830]: I0311 09:27:10.579728 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" podStartSLOduration=7.579704596 podStartE2EDuration="7.579704596s" podCreationTimestamp="2026-03-11 09:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:27:10.575891959 +0000 UTC m=+798.357042668" watchObservedRunningTime="2026-03-11 09:27:10.579704596 +0000 UTC m=+798.360855285" Mar 11 09:27:10 crc kubenswrapper[4830]: I0311 09:27:10.582830 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:10 crc kubenswrapper[4830]: I0311 09:27:10.583978 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:13 crc kubenswrapper[4830]: I0311 09:27:13.060415 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:27:13 crc kubenswrapper[4830]: I0311 09:27:13.061240 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:27:16 crc kubenswrapper[4830]: I0311 09:27:16.933148 4830 scope.go:117] "RemoveContainer" containerID="81a0883de2371cc2d74c6c6fe6667f4a6bc5042033ad0fe8911ac16bd14906ac" Mar 11 09:27:16 crc kubenswrapper[4830]: E0311 09:27:16.933621 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8w98l_openshift-multus(75fdb109-77cf-4d97-ac3c-6f3139b3bb7a)\"" pod="openshift-multus/multus-8w98l" podUID="75fdb109-77cf-4d97-ac3c-6f3139b3bb7a" Mar 11 09:27:28 crc kubenswrapper[4830]: I0311 09:27:28.932619 4830 scope.go:117] "RemoveContainer" containerID="81a0883de2371cc2d74c6c6fe6667f4a6bc5042033ad0fe8911ac16bd14906ac" Mar 11 09:27:29 crc kubenswrapper[4830]: I0311 09:27:29.670877 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/2.log" Mar 11 09:27:29 crc kubenswrapper[4830]: I0311 09:27:29.671662 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/1.log" Mar 11 09:27:29 crc kubenswrapper[4830]: I0311 09:27:29.671727 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8w98l" event={"ID":"75fdb109-77cf-4d97-ac3c-6f3139b3bb7a","Type":"ContainerStarted","Data":"f9441d516411079964391734c4a2147ee2d5457e53f9ae6728818b61c3812bb0"} Mar 11 09:27:33 crc kubenswrapper[4830]: I0311 09:27:33.471779 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c46p" Mar 11 09:27:43 crc kubenswrapper[4830]: I0311 09:27:43.060753 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:27:43 crc kubenswrapper[4830]: I0311 09:27:43.061439 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.053193 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8"] Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.056294 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.059451 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.067992 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8"] Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.157240 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.157292 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjwph\" (UniqueName: \"kubernetes.io/projected/9c57794f-6ae4-4350-9a03-efe9a10f5d47-kube-api-access-qjwph\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.157332 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.258206 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.258259 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjwph\" (UniqueName: \"kubernetes.io/projected/9c57794f-6ae4-4350-9a03-efe9a10f5d47-kube-api-access-qjwph\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.258290 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.258987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.259184 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.282969 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjwph\" (UniqueName: \"kubernetes.io/projected/9c57794f-6ae4-4350-9a03-efe9a10f5d47-kube-api-access-qjwph\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.383460 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.616510 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8"] Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.770302 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" event={"ID":"9c57794f-6ae4-4350-9a03-efe9a10f5d47","Type":"ContainerStarted","Data":"8441cc8b9444c37be508f78d4d0e4e15232a157f57ee918eb089d58fed353b0f"} Mar 11 09:27:46 crc kubenswrapper[4830]: I0311 09:27:46.770575 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" event={"ID":"9c57794f-6ae4-4350-9a03-efe9a10f5d47","Type":"ContainerStarted","Data":"b73cef09eb57bf40c56ac19a8d8ffbc99379e5243626cb958ab6d5a175e9a856"} Mar 11 09:27:47 crc kubenswrapper[4830]: I0311 09:27:47.779823 4830 generic.go:334] "Generic (PLEG): container finished" podID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerID="8441cc8b9444c37be508f78d4d0e4e15232a157f57ee918eb089d58fed353b0f" exitCode=0 Mar 11 09:27:47 crc kubenswrapper[4830]: I0311 09:27:47.780054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" event={"ID":"9c57794f-6ae4-4350-9a03-efe9a10f5d47","Type":"ContainerDied","Data":"8441cc8b9444c37be508f78d4d0e4e15232a157f57ee918eb089d58fed353b0f"} Mar 11 09:27:49 crc kubenswrapper[4830]: I0311 09:27:49.798365 4830 generic.go:334] "Generic (PLEG): container finished" podID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerID="745df0c10487ec730253f3fc620f98951291263ed2a3b14662e79b78199a18f8" exitCode=0 Mar 11 09:27:49 crc kubenswrapper[4830]: I0311 09:27:49.798511 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" event={"ID":"9c57794f-6ae4-4350-9a03-efe9a10f5d47","Type":"ContainerDied","Data":"745df0c10487ec730253f3fc620f98951291263ed2a3b14662e79b78199a18f8"} Mar 11 09:27:50 crc kubenswrapper[4830]: I0311 09:27:50.810289 4830 generic.go:334] "Generic (PLEG): container finished" podID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerID="861c7b58ef45666d38508f6a294dfa2f8f6ed1d3fda1afec2e5b8918f8a6f9c6" exitCode=0 Mar 11 09:27:50 crc kubenswrapper[4830]: I0311 09:27:50.810330 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" event={"ID":"9c57794f-6ae4-4350-9a03-efe9a10f5d47","Type":"ContainerDied","Data":"861c7b58ef45666d38508f6a294dfa2f8f6ed1d3fda1afec2e5b8918f8a6f9c6"} Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.150736 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.240361 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-bundle\") pod \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.240438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-util\") pod \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.240495 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjwph\" (UniqueName: \"kubernetes.io/projected/9c57794f-6ae4-4350-9a03-efe9a10f5d47-kube-api-access-qjwph\") pod \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\" (UID: \"9c57794f-6ae4-4350-9a03-efe9a10f5d47\") " Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.241065 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-bundle" (OuterVolumeSpecName: "bundle") pod "9c57794f-6ae4-4350-9a03-efe9a10f5d47" (UID: "9c57794f-6ae4-4350-9a03-efe9a10f5d47"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.247216 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c57794f-6ae4-4350-9a03-efe9a10f5d47-kube-api-access-qjwph" (OuterVolumeSpecName: "kube-api-access-qjwph") pod "9c57794f-6ae4-4350-9a03-efe9a10f5d47" (UID: "9c57794f-6ae4-4350-9a03-efe9a10f5d47"). InnerVolumeSpecName "kube-api-access-qjwph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.342360 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjwph\" (UniqueName: \"kubernetes.io/projected/9c57794f-6ae4-4350-9a03-efe9a10f5d47-kube-api-access-qjwph\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.342404 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.490355 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-util" (OuterVolumeSpecName: "util") pod "9c57794f-6ae4-4350-9a03-efe9a10f5d47" (UID: "9c57794f-6ae4-4350-9a03-efe9a10f5d47"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.546266 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9c57794f-6ae4-4350-9a03-efe9a10f5d47-util\") on node \"crc\" DevicePath \"\"" Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.833389 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" event={"ID":"9c57794f-6ae4-4350-9a03-efe9a10f5d47","Type":"ContainerDied","Data":"b73cef09eb57bf40c56ac19a8d8ffbc99379e5243626cb958ab6d5a175e9a856"} Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.833833 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b73cef09eb57bf40c56ac19a8d8ffbc99379e5243626cb958ab6d5a175e9a856" Mar 11 09:27:52 crc kubenswrapper[4830]: I0311 09:27:52.833492 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8" Mar 11 09:27:53 crc kubenswrapper[4830]: I0311 09:27:53.995276 4830 scope.go:117] "RemoveContainer" containerID="f3a442651aaa10001b0f1ecaabf1c544993cc31fefb11306183c4742763b5223" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.846232 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8w98l_75fdb109-77cf-4d97-ac3c-6f3139b3bb7a/kube-multus/2.log" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.913581 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6"] Mar 11 09:27:54 crc kubenswrapper[4830]: E0311 09:27:54.913957 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerName="util" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.913986 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerName="util" Mar 11 09:27:54 crc kubenswrapper[4830]: E0311 09:27:54.914007 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerName="pull" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.914062 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerName="pull" Mar 11 09:27:54 crc kubenswrapper[4830]: E0311 09:27:54.914084 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerName="extract" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.914096 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" containerName="extract" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.914293 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c57794f-6ae4-4350-9a03-efe9a10f5d47" 
containerName="extract" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.914901 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.919075 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.919191 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-cwzfz" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.919394 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.921647 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6"] Mar 11 09:27:54 crc kubenswrapper[4830]: I0311 09:27:54.987806 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjk8r\" (UniqueName: \"kubernetes.io/projected/b282dd08-59c0-4a26-a7a0-e165dfc899b6-kube-api-access-kjk8r\") pod \"nmstate-operator-796d4cfff4-zqpc6\" (UID: \"b282dd08-59c0-4a26-a7a0-e165dfc899b6\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6" Mar 11 09:27:55 crc kubenswrapper[4830]: I0311 09:27:55.088761 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjk8r\" (UniqueName: \"kubernetes.io/projected/b282dd08-59c0-4a26-a7a0-e165dfc899b6-kube-api-access-kjk8r\") pod \"nmstate-operator-796d4cfff4-zqpc6\" (UID: \"b282dd08-59c0-4a26-a7a0-e165dfc899b6\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6" Mar 11 09:27:55 crc kubenswrapper[4830]: I0311 09:27:55.114460 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjk8r\" (UniqueName: 
\"kubernetes.io/projected/b282dd08-59c0-4a26-a7a0-e165dfc899b6-kube-api-access-kjk8r\") pod \"nmstate-operator-796d4cfff4-zqpc6\" (UID: \"b282dd08-59c0-4a26-a7a0-e165dfc899b6\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6" Mar 11 09:27:55 crc kubenswrapper[4830]: I0311 09:27:55.228311 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6" Mar 11 09:27:55 crc kubenswrapper[4830]: I0311 09:27:55.629946 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6"] Mar 11 09:27:55 crc kubenswrapper[4830]: I0311 09:27:55.854331 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6" event={"ID":"b282dd08-59c0-4a26-a7a0-e165dfc899b6","Type":"ContainerStarted","Data":"f3782221bf44021f2273a7cda343940360f6f025f57b74026da2e8183094840a"} Mar 11 09:27:59 crc kubenswrapper[4830]: I0311 09:27:59.000686 4830 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 11 09:27:59 crc kubenswrapper[4830]: I0311 09:27:59.893739 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6" event={"ID":"b282dd08-59c0-4a26-a7a0-e165dfc899b6","Type":"ContainerStarted","Data":"0c7655b46c1c21b2616dfd1dd59a2eb73e054da24bcc8cebc3ecdc96dfc6a2cf"} Mar 11 09:27:59 crc kubenswrapper[4830]: I0311 09:27:59.914277 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-zqpc6" podStartSLOduration=2.640260079 podStartE2EDuration="5.914253062s" podCreationTimestamp="2026-03-11 09:27:54 +0000 UTC" firstStartedPulling="2026-03-11 09:27:55.635773473 +0000 UTC m=+843.416924162" lastFinishedPulling="2026-03-11 09:27:58.909766456 +0000 UTC m=+846.690917145" observedRunningTime="2026-03-11 09:27:59.907179264 +0000 UTC 
m=+847.688329963" watchObservedRunningTime="2026-03-11 09:27:59.914253062 +0000 UTC m=+847.695403741" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.131408 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553688-6lxbc"] Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.132422 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.134616 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.134756 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.134828 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.143760 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553688-6lxbc"] Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.162703 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp5s4\" (UniqueName: \"kubernetes.io/projected/e1f87712-c9d4-4576-b590-95f554829b4a-kube-api-access-dp5s4\") pod \"auto-csr-approver-29553688-6lxbc\" (UID: \"e1f87712-c9d4-4576-b590-95f554829b4a\") " pod="openshift-infra/auto-csr-approver-29553688-6lxbc" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.264420 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp5s4\" (UniqueName: \"kubernetes.io/projected/e1f87712-c9d4-4576-b590-95f554829b4a-kube-api-access-dp5s4\") pod \"auto-csr-approver-29553688-6lxbc\" (UID: \"e1f87712-c9d4-4576-b590-95f554829b4a\") " 
pod="openshift-infra/auto-csr-approver-29553688-6lxbc" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.290062 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp5s4\" (UniqueName: \"kubernetes.io/projected/e1f87712-c9d4-4576-b590-95f554829b4a-kube-api-access-dp5s4\") pod \"auto-csr-approver-29553688-6lxbc\" (UID: \"e1f87712-c9d4-4576-b590-95f554829b4a\") " pod="openshift-infra/auto-csr-approver-29553688-6lxbc" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.445818 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.719364 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553688-6lxbc"] Mar 11 09:28:00 crc kubenswrapper[4830]: W0311 09:28:00.720313 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1f87712_c9d4_4576_b590_95f554829b4a.slice/crio-329470e7d1659e741be8fd58ec26ac1bd565e3b600b70eaa9abd59ffdacb29c7 WatchSource:0}: Error finding container 329470e7d1659e741be8fd58ec26ac1bd565e3b600b70eaa9abd59ffdacb29c7: Status 404 returned error can't find the container with id 329470e7d1659e741be8fd58ec26ac1bd565e3b600b70eaa9abd59ffdacb29c7 Mar 11 09:28:00 crc kubenswrapper[4830]: I0311 09:28:00.900199 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" event={"ID":"e1f87712-c9d4-4576-b590-95f554829b4a","Type":"ContainerStarted","Data":"329470e7d1659e741be8fd58ec26ac1bd565e3b600b70eaa9abd59ffdacb29c7"} Mar 11 09:28:01 crc kubenswrapper[4830]: I0311 09:28:01.907218 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" 
event={"ID":"e1f87712-c9d4-4576-b590-95f554829b4a","Type":"ContainerStarted","Data":"63f3033d56819f54f0274d7284469e7705ea7356b5fd01d127ca7a394c42cefc"} Mar 11 09:28:01 crc kubenswrapper[4830]: I0311 09:28:01.921717 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" podStartSLOduration=1.014098555 podStartE2EDuration="1.92169755s" podCreationTimestamp="2026-03-11 09:28:00 +0000 UTC" firstStartedPulling="2026-03-11 09:28:00.721506469 +0000 UTC m=+848.502657158" lastFinishedPulling="2026-03-11 09:28:01.629105464 +0000 UTC m=+849.410256153" observedRunningTime="2026-03-11 09:28:01.918429979 +0000 UTC m=+849.699580668" watchObservedRunningTime="2026-03-11 09:28:01.92169755 +0000 UTC m=+849.702848239" Mar 11 09:28:02 crc kubenswrapper[4830]: I0311 09:28:02.913931 4830 generic.go:334] "Generic (PLEG): container finished" podID="e1f87712-c9d4-4576-b590-95f554829b4a" containerID="63f3033d56819f54f0274d7284469e7705ea7356b5fd01d127ca7a394c42cefc" exitCode=0 Mar 11 09:28:02 crc kubenswrapper[4830]: I0311 09:28:02.914038 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" event={"ID":"e1f87712-c9d4-4576-b590-95f554829b4a","Type":"ContainerDied","Data":"63f3033d56819f54f0274d7284469e7705ea7356b5fd01d127ca7a394c42cefc"} Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.173930 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.227884 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp5s4\" (UniqueName: \"kubernetes.io/projected/e1f87712-c9d4-4576-b590-95f554829b4a-kube-api-access-dp5s4\") pod \"e1f87712-c9d4-4576-b590-95f554829b4a\" (UID: \"e1f87712-c9d4-4576-b590-95f554829b4a\") " Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.233642 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f87712-c9d4-4576-b590-95f554829b4a-kube-api-access-dp5s4" (OuterVolumeSpecName: "kube-api-access-dp5s4") pod "e1f87712-c9d4-4576-b590-95f554829b4a" (UID: "e1f87712-c9d4-4576-b590-95f554829b4a"). InnerVolumeSpecName "kube-api-access-dp5s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.329696 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp5s4\" (UniqueName: \"kubernetes.io/projected/e1f87712-c9d4-4576-b590-95f554829b4a-kube-api-access-dp5s4\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.932540 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.945496 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553688-6lxbc" event={"ID":"e1f87712-c9d4-4576-b590-95f554829b4a","Type":"ContainerDied","Data":"329470e7d1659e741be8fd58ec26ac1bd565e3b600b70eaa9abd59ffdacb29c7"} Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.945562 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="329470e7d1659e741be8fd58ec26ac1bd565e3b600b70eaa9abd59ffdacb29c7" Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.981769 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553682-rnxn9"] Mar 11 09:28:04 crc kubenswrapper[4830]: I0311 09:28:04.984825 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553682-rnxn9"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.163039 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm"] Mar 11 09:28:06 crc kubenswrapper[4830]: E0311 09:28:06.163623 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f87712-c9d4-4576-b590-95f554829b4a" containerName="oc" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.163643 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f87712-c9d4-4576-b590-95f554829b4a" containerName="oc" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.163764 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f87712-c9d4-4576-b590-95f554829b4a" containerName="oc" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.164478 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.168288 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-zp27m" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.179056 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.179865 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.182495 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.184775 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.215719 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.244816 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6gnfd"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.245634 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.253989 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp2rj\" (UniqueName: \"kubernetes.io/projected/9e81d681-fa0d-4789-8762-ee953dc9f5aa-kube-api-access-lp2rj\") pod \"nmstate-webhook-5f558f5558-vzfnr\" (UID: \"9e81d681-fa0d-4789-8762-ee953dc9f5aa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.254048 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhrz8\" (UniqueName: \"kubernetes.io/projected/20718750-ed46-4785-b2ca-0e41dfd093be-kube-api-access-rhrz8\") pod \"nmstate-metrics-9b8c8685d-qgmxm\" (UID: \"20718750-ed46-4785-b2ca-0e41dfd093be\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.254123 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9e81d681-fa0d-4789-8762-ee953dc9f5aa-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-vzfnr\" (UID: \"9e81d681-fa0d-4789-8762-ee953dc9f5aa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.333109 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.334179 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.337096 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.337416 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.337502 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4565l" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.337635 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.355242 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-ovs-socket\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.355299 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9e81d681-fa0d-4789-8762-ee953dc9f5aa-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-vzfnr\" (UID: \"9e81d681-fa0d-4789-8762-ee953dc9f5aa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.355342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-nmstate-lock\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc 
kubenswrapper[4830]: I0311 09:28:06.355363 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp2rj\" (UniqueName: \"kubernetes.io/projected/9e81d681-fa0d-4789-8762-ee953dc9f5aa-kube-api-access-lp2rj\") pod \"nmstate-webhook-5f558f5558-vzfnr\" (UID: \"9e81d681-fa0d-4789-8762-ee953dc9f5aa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.355383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhrz8\" (UniqueName: \"kubernetes.io/projected/20718750-ed46-4785-b2ca-0e41dfd093be-kube-api-access-rhrz8\") pod \"nmstate-metrics-9b8c8685d-qgmxm\" (UID: \"20718750-ed46-4785-b2ca-0e41dfd093be\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.355407 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzmx4\" (UniqueName: \"kubernetes.io/projected/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-kube-api-access-kzmx4\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.355428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-dbus-socket\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.362791 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9e81d681-fa0d-4789-8762-ee953dc9f5aa-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-vzfnr\" (UID: \"9e81d681-fa0d-4789-8762-ee953dc9f5aa\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.373528 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhrz8\" (UniqueName: \"kubernetes.io/projected/20718750-ed46-4785-b2ca-0e41dfd093be-kube-api-access-rhrz8\") pod \"nmstate-metrics-9b8c8685d-qgmxm\" (UID: \"20718750-ed46-4785-b2ca-0e41dfd093be\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.376859 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp2rj\" (UniqueName: \"kubernetes.io/projected/9e81d681-fa0d-4789-8762-ee953dc9f5aa-kube-api-access-lp2rj\") pod \"nmstate-webhook-5f558f5558-vzfnr\" (UID: \"9e81d681-fa0d-4789-8762-ee953dc9f5aa\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.456914 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-nmstate-lock\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.456968 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzmx4\" (UniqueName: \"kubernetes.io/projected/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-kube-api-access-kzmx4\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.456997 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-dbus-socket\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " 
pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.457052 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bed2ffb-6685-4495-badf-1c70ea17d8fa-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.457054 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-nmstate-lock\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.457078 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bed2ffb-6685-4495-badf-1c70ea17d8fa-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.457205 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-ovs-socket\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.457268 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-ovs-socket\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " 
pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.457308 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q4kr\" (UniqueName: \"kubernetes.io/projected/7bed2ffb-6685-4495-badf-1c70ea17d8fa-kube-api-access-2q4kr\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.457634 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-dbus-socket\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.473929 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzmx4\" (UniqueName: \"kubernetes.io/projected/3dec7623-b8d0-4aa6-9a7f-0796475bcaaf-kube-api-access-kzmx4\") pod \"nmstate-handler-6gnfd\" (UID: \"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf\") " pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.493965 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-694c9dc6c5-mshdh"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.494719 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.512579 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-694c9dc6c5-mshdh"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.536626 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.549759 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558402 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90d59783-83c5-4003-b41d-b9b295392503-console-oauth-config\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558450 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnlp8\" (UniqueName: \"kubernetes.io/projected/90d59783-83c5-4003-b41d-b9b295392503-kube-api-access-wnlp8\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558474 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bed2ffb-6685-4495-badf-1c70ea17d8fa-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558496 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90d59783-83c5-4003-b41d-b9b295392503-console-serving-cert\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 
09:28:06.558514 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bed2ffb-6685-4495-badf-1c70ea17d8fa-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558554 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q4kr\" (UniqueName: \"kubernetes.io/projected/7bed2ffb-6685-4495-badf-1c70ea17d8fa-kube-api-access-2q4kr\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558578 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-console-config\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558593 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-service-ca\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-trusted-ca-bundle\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " 
pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.558632 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-oauth-serving-cert\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.559914 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7bed2ffb-6685-4495-badf-1c70ea17d8fa-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.563437 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7bed2ffb-6685-4495-badf-1c70ea17d8fa-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.564980 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.577264 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q4kr\" (UniqueName: \"kubernetes.io/projected/7bed2ffb-6685-4495-badf-1c70ea17d8fa-kube-api-access-2q4kr\") pod \"nmstate-console-plugin-86f58fcf4-ztp77\" (UID: \"7bed2ffb-6685-4495-badf-1c70ea17d8fa\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: W0311 09:28:06.585737 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dec7623_b8d0_4aa6_9a7f_0796475bcaaf.slice/crio-c07dd0a5974e5cffbd11d8595aa92bcb45a4374aeec5af38bdeaab2818841a59 WatchSource:0}: Error finding container c07dd0a5974e5cffbd11d8595aa92bcb45a4374aeec5af38bdeaab2818841a59: Status 404 returned error can't find the container with id c07dd0a5974e5cffbd11d8595aa92bcb45a4374aeec5af38bdeaab2818841a59 Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.655942 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.659200 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-oauth-serving-cert\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.659250 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90d59783-83c5-4003-b41d-b9b295392503-console-oauth-config\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.659279 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnlp8\" (UniqueName: \"kubernetes.io/projected/90d59783-83c5-4003-b41d-b9b295392503-kube-api-access-wnlp8\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.659300 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90d59783-83c5-4003-b41d-b9b295392503-console-serving-cert\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.659349 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-console-config\") pod \"console-694c9dc6c5-mshdh\" (UID: 
\"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.659368 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-service-ca\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.659384 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-trusted-ca-bundle\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.660470 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-trusted-ca-bundle\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.660480 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-oauth-serving-cert\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.661063 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-service-ca\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " 
pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.661169 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90d59783-83c5-4003-b41d-b9b295392503-console-config\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.666080 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90d59783-83c5-4003-b41d-b9b295392503-console-oauth-config\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.677961 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90d59783-83c5-4003-b41d-b9b295392503-console-serving-cert\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.691155 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnlp8\" (UniqueName: \"kubernetes.io/projected/90d59783-83c5-4003-b41d-b9b295392503-kube-api-access-wnlp8\") pod \"console-694c9dc6c5-mshdh\" (UID: \"90d59783-83c5-4003-b41d-b9b295392503\") " pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.800750 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm"] Mar 11 09:28:06 crc kubenswrapper[4830]: W0311 09:28:06.811857 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20718750_ed46_4785_b2ca_0e41dfd093be.slice/crio-bc4f6e9e1f5cbcbeaa163228bd3d8af0f6fc37e78eaa4a65208030c776901b55 WatchSource:0}: Error finding container bc4f6e9e1f5cbcbeaa163228bd3d8af0f6fc37e78eaa4a65208030c776901b55: Status 404 returned error can't find the container with id bc4f6e9e1f5cbcbeaa163228bd3d8af0f6fc37e78eaa4a65208030c776901b55 Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.814554 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.867250 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.944905 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b49631-e712-4c93-afc6-59aff1ef208c" path="/var/lib/kubelet/pods/a5b49631-e712-4c93-afc6-59aff1ef208c/volumes" Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.954355 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" event={"ID":"20718750-ed46-4785-b2ca-0e41dfd093be","Type":"ContainerStarted","Data":"bc4f6e9e1f5cbcbeaa163228bd3d8af0f6fc37e78eaa4a65208030c776901b55"} Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.957232 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6gnfd" event={"ID":"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf","Type":"ContainerStarted","Data":"c07dd0a5974e5cffbd11d8595aa92bcb45a4374aeec5af38bdeaab2818841a59"} Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.966030 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr"] Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.966585 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" event={"ID":"7bed2ffb-6685-4495-badf-1c70ea17d8fa","Type":"ContainerStarted","Data":"2f4ba768912ecff53f90626e950c84305d002405a68282bfd307a26e1cf18f1c"} Mar 11 09:28:06 crc kubenswrapper[4830]: W0311 09:28:06.979819 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e81d681_fa0d_4789_8762_ee953dc9f5aa.slice/crio-6abd79ed99fcebbe354c0285f5349bc53210cba6a97c8c20a8dbc04c352e6f4e WatchSource:0}: Error finding container 6abd79ed99fcebbe354c0285f5349bc53210cba6a97c8c20a8dbc04c352e6f4e: Status 404 returned error can't find the container with id 6abd79ed99fcebbe354c0285f5349bc53210cba6a97c8c20a8dbc04c352e6f4e Mar 11 09:28:06 crc kubenswrapper[4830]: I0311 09:28:06.996658 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-694c9dc6c5-mshdh"] Mar 11 09:28:07 crc kubenswrapper[4830]: W0311 09:28:07.001142 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d59783_83c5_4003_b41d_b9b295392503.slice/crio-524dfb867b2b4b1d30acc31455cd17d09981a138f6f057ec84ca5d00d491896b WatchSource:0}: Error finding container 524dfb867b2b4b1d30acc31455cd17d09981a138f6f057ec84ca5d00d491896b: Status 404 returned error can't find the container with id 524dfb867b2b4b1d30acc31455cd17d09981a138f6f057ec84ca5d00d491896b Mar 11 09:28:07 crc kubenswrapper[4830]: I0311 09:28:07.980710 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-694c9dc6c5-mshdh" event={"ID":"90d59783-83c5-4003-b41d-b9b295392503","Type":"ContainerStarted","Data":"47199a598314d7458b42bb11e32c429629f3eca51dfc5e14d18aaa5e12ab5845"} Mar 11 09:28:07 crc kubenswrapper[4830]: I0311 09:28:07.983239 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-694c9dc6c5-mshdh" 
event={"ID":"90d59783-83c5-4003-b41d-b9b295392503","Type":"ContainerStarted","Data":"524dfb867b2b4b1d30acc31455cd17d09981a138f6f057ec84ca5d00d491896b"} Mar 11 09:28:07 crc kubenswrapper[4830]: I0311 09:28:07.983402 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" event={"ID":"9e81d681-fa0d-4789-8762-ee953dc9f5aa","Type":"ContainerStarted","Data":"6abd79ed99fcebbe354c0285f5349bc53210cba6a97c8c20a8dbc04c352e6f4e"} Mar 11 09:28:08 crc kubenswrapper[4830]: I0311 09:28:08.012386 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-694c9dc6c5-mshdh" podStartSLOduration=2.012307894 podStartE2EDuration="2.012307894s" podCreationTimestamp="2026-03-11 09:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:28:08.006679386 +0000 UTC m=+855.787830115" watchObservedRunningTime="2026-03-11 09:28:08.012307894 +0000 UTC m=+855.793458603" Mar 11 09:28:09 crc kubenswrapper[4830]: I0311 09:28:09.997319 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" event={"ID":"9e81d681-fa0d-4789-8762-ee953dc9f5aa","Type":"ContainerStarted","Data":"694a5243701220b31439bc78c25b991ff313b9803beedb943c475375202db7b6"} Mar 11 09:28:09 crc kubenswrapper[4830]: I0311 09:28:09.997817 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:09 crc kubenswrapper[4830]: I0311 09:28:09.999797 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" event={"ID":"7bed2ffb-6685-4495-badf-1c70ea17d8fa","Type":"ContainerStarted","Data":"b5ed3f3c9b5dbca45f2707d4081d91241b1da0cf57061da2045334ee5ab79b17"} Mar 11 09:28:10 crc kubenswrapper[4830]: I0311 09:28:10.002179 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" event={"ID":"20718750-ed46-4785-b2ca-0e41dfd093be","Type":"ContainerStarted","Data":"582eb5077f30525ca0b2896bf5d371943ad29704bd1fb296b6d609d30fef46c1"} Mar 11 09:28:10 crc kubenswrapper[4830]: I0311 09:28:10.017933 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" podStartSLOduration=1.255862634 podStartE2EDuration="4.017909172s" podCreationTimestamp="2026-03-11 09:28:06 +0000 UTC" firstStartedPulling="2026-03-11 09:28:06.983221718 +0000 UTC m=+854.764372407" lastFinishedPulling="2026-03-11 09:28:09.745268256 +0000 UTC m=+857.526418945" observedRunningTime="2026-03-11 09:28:10.013393644 +0000 UTC m=+857.794544353" watchObservedRunningTime="2026-03-11 09:28:10.017909172 +0000 UTC m=+857.799059861" Mar 11 09:28:10 crc kubenswrapper[4830]: I0311 09:28:10.036670 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ztp77" podStartSLOduration=1.178338036 podStartE2EDuration="4.036645918s" podCreationTimestamp="2026-03-11 09:28:06 +0000 UTC" firstStartedPulling="2026-03-11 09:28:06.876606684 +0000 UTC m=+854.657757373" lastFinishedPulling="2026-03-11 09:28:09.734914576 +0000 UTC m=+857.516065255" observedRunningTime="2026-03-11 09:28:10.031850954 +0000 UTC m=+857.813001643" watchObservedRunningTime="2026-03-11 09:28:10.036645918 +0000 UTC m=+857.817796607" Mar 11 09:28:11 crc kubenswrapper[4830]: I0311 09:28:11.013154 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6gnfd" event={"ID":"3dec7623-b8d0-4aa6-9a7f-0796475bcaaf","Type":"ContainerStarted","Data":"9883f29b79ed928f729696a7973fee263dda4cf9e50461f72eed5074432c1bf0"} Mar 11 09:28:11 crc kubenswrapper[4830]: I0311 09:28:11.014336 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6gnfd" 
Mar 11 09:28:11 crc kubenswrapper[4830]: I0311 09:28:11.029692 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6gnfd" podStartSLOduration=1.882712825 podStartE2EDuration="5.029670082s" podCreationTimestamp="2026-03-11 09:28:06 +0000 UTC" firstStartedPulling="2026-03-11 09:28:06.587978799 +0000 UTC m=+854.369129488" lastFinishedPulling="2026-03-11 09:28:09.734936056 +0000 UTC m=+857.516086745" observedRunningTime="2026-03-11 09:28:11.029462396 +0000 UTC m=+858.810613105" watchObservedRunningTime="2026-03-11 09:28:11.029670082 +0000 UTC m=+858.810820771" Mar 11 09:28:13 crc kubenswrapper[4830]: I0311 09:28:13.028571 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" event={"ID":"20718750-ed46-4785-b2ca-0e41dfd093be","Type":"ContainerStarted","Data":"54f35694a2815feeb1feb5685c9539969577f841157f16d271982a9ef94b0b2e"} Mar 11 09:28:13 crc kubenswrapper[4830]: I0311 09:28:13.047578 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qgmxm" podStartSLOduration=1.73369368 podStartE2EDuration="7.047554954s" podCreationTimestamp="2026-03-11 09:28:06 +0000 UTC" firstStartedPulling="2026-03-11 09:28:06.813831661 +0000 UTC m=+854.594982350" lastFinishedPulling="2026-03-11 09:28:12.127692935 +0000 UTC m=+859.908843624" observedRunningTime="2026-03-11 09:28:13.047362728 +0000 UTC m=+860.828513447" watchObservedRunningTime="2026-03-11 09:28:13.047554954 +0000 UTC m=+860.828705673" Mar 11 09:28:13 crc kubenswrapper[4830]: I0311 09:28:13.060736 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:28:13 crc kubenswrapper[4830]: I0311 09:28:13.060796 
4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:28:13 crc kubenswrapper[4830]: I0311 09:28:13.060852 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:28:13 crc kubenswrapper[4830]: I0311 09:28:13.061656 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f1549afce8227de9820039f9dd4bcf657fcc7950e158e1064942fb283e47f6d"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 09:28:13 crc kubenswrapper[4830]: I0311 09:28:13.061748 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://9f1549afce8227de9820039f9dd4bcf657fcc7950e158e1064942fb283e47f6d" gracePeriod=600 Mar 11 09:28:14 crc kubenswrapper[4830]: I0311 09:28:14.041391 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="9f1549afce8227de9820039f9dd4bcf657fcc7950e158e1064942fb283e47f6d" exitCode=0 Mar 11 09:28:14 crc kubenswrapper[4830]: I0311 09:28:14.041498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"9f1549afce8227de9820039f9dd4bcf657fcc7950e158e1064942fb283e47f6d"} Mar 11 09:28:14 crc kubenswrapper[4830]: 
I0311 09:28:14.041823 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"490e43e253d22e49dfc5c2a704ffdefb34fe709ef23f6e9173eecf22518d399e"} Mar 11 09:28:14 crc kubenswrapper[4830]: I0311 09:28:14.041859 4830 scope.go:117] "RemoveContainer" containerID="142e36b46036713ff5cf010b0bd983e98265d8b1e5bd25a219605acc4cae5ae2" Mar 11 09:28:16 crc kubenswrapper[4830]: I0311 09:28:16.815988 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:16 crc kubenswrapper[4830]: I0311 09:28:16.817346 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:16 crc kubenswrapper[4830]: I0311 09:28:16.944587 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:16 crc kubenswrapper[4830]: I0311 09:28:16.967075 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6gnfd" Mar 11 09:28:17 crc kubenswrapper[4830]: I0311 09:28:17.073599 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-694c9dc6c5-mshdh" Mar 11 09:28:17 crc kubenswrapper[4830]: I0311 09:28:17.208504 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-75j46"] Mar 11 09:28:26 crc kubenswrapper[4830]: I0311 09:28:26.560291 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-vzfnr" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.243076 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b5nzf"] Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.245646 4830 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.254917 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5nzf"] Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.320333 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-utilities\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.320621 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-catalog-content\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.320742 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4sgx\" (UniqueName: \"kubernetes.io/projected/21f43104-3861-4b6c-95cf-c99e43f4f556-kube-api-access-d4sgx\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.421950 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-utilities\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.422084 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-catalog-content\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.422157 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4sgx\" (UniqueName: \"kubernetes.io/projected/21f43104-3861-4b6c-95cf-c99e43f4f556-kube-api-access-d4sgx\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.422729 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-catalog-content\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.422943 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-utilities\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.459373 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4sgx\" (UniqueName: \"kubernetes.io/projected/21f43104-3861-4b6c-95cf-c99e43f4f556-kube-api-access-d4sgx\") pod \"certified-operators-b5nzf\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:38 crc kubenswrapper[4830]: I0311 09:28:38.577736 4830 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.059042 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5nzf"] Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.225286 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nzf" event={"ID":"21f43104-3861-4b6c-95cf-c99e43f4f556","Type":"ContainerStarted","Data":"000c91f1c932637670263f7f8e38ab0ddf90457034770380ba2a42e907db5f2a"} Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.266190 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql"] Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.267230 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.271427 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.277759 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql"] Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.337848 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.337995 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbwdk\" (UniqueName: \"kubernetes.io/projected/c7147327-7c7b-4a1d-94c7-c684ec5337a0-kube-api-access-xbwdk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.338086 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.439914 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbwdk\" (UniqueName: \"kubernetes.io/projected/c7147327-7c7b-4a1d-94c7-c684ec5337a0-kube-api-access-xbwdk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.440005 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.440098 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.440744 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.440878 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.461507 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbwdk\" (UniqueName: \"kubernetes.io/projected/c7147327-7c7b-4a1d-94c7-c684ec5337a0-kube-api-access-xbwdk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.583990 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:39 crc kubenswrapper[4830]: I0311 09:28:39.799613 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql"] Mar 11 09:28:40 crc kubenswrapper[4830]: I0311 09:28:40.235141 4830 generic.go:334] "Generic (PLEG): container finished" podID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerID="6ddb37a56bae4c1a20a27d94e64d69da44852df1c3e57e1f6198a82c31b9bad5" exitCode=0 Mar 11 09:28:40 crc kubenswrapper[4830]: I0311 09:28:40.235294 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nzf" event={"ID":"21f43104-3861-4b6c-95cf-c99e43f4f556","Type":"ContainerDied","Data":"6ddb37a56bae4c1a20a27d94e64d69da44852df1c3e57e1f6198a82c31b9bad5"} Mar 11 09:28:40 crc kubenswrapper[4830]: I0311 09:28:40.238945 4830 generic.go:334] "Generic (PLEG): container finished" podID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerID="e73abdd95305eab88cf0258fe6acd5a385709bf6c31ae65de18c91c1c261236e" exitCode=0 Mar 11 09:28:40 crc kubenswrapper[4830]: I0311 09:28:40.239085 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" event={"ID":"c7147327-7c7b-4a1d-94c7-c684ec5337a0","Type":"ContainerDied","Data":"e73abdd95305eab88cf0258fe6acd5a385709bf6c31ae65de18c91c1c261236e"} Mar 11 09:28:40 crc kubenswrapper[4830]: I0311 09:28:40.239134 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" event={"ID":"c7147327-7c7b-4a1d-94c7-c684ec5337a0","Type":"ContainerStarted","Data":"0a6e0e7af1a63c5d00eb0d7bd852b86ce53645453a8d5428bc25554bbce81597"} Mar 11 09:28:41 crc kubenswrapper[4830]: I0311 09:28:41.248226 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-b5nzf" event={"ID":"21f43104-3861-4b6c-95cf-c99e43f4f556","Type":"ContainerStarted","Data":"8339eb8aca6d99d2c80f67578ba4f3bc6991f249e94b564c5ccebb74e88e9d1b"} Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.242307 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-75j46" podUID="cd450036-5201-4553-a9de-c08a7a9c9f52" containerName="console" containerID="cri-o://39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83" gracePeriod=15 Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.257720 4830 generic.go:334] "Generic (PLEG): container finished" podID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerID="cea48cc159ac69a30473560da8e2be0e84f428bd5eb0743140cc812ef5ac520f" exitCode=0 Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.257826 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" event={"ID":"c7147327-7c7b-4a1d-94c7-c684ec5337a0","Type":"ContainerDied","Data":"cea48cc159ac69a30473560da8e2be0e84f428bd5eb0743140cc812ef5ac520f"} Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.260211 4830 generic.go:334] "Generic (PLEG): container finished" podID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerID="8339eb8aca6d99d2c80f67578ba4f3bc6991f249e94b564c5ccebb74e88e9d1b" exitCode=0 Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.260256 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nzf" event={"ID":"21f43104-3861-4b6c-95cf-c99e43f4f556","Type":"ContainerDied","Data":"8339eb8aca6d99d2c80f67578ba4f3bc6991f249e94b564c5ccebb74e88e9d1b"} Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.599572 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-75j46_cd450036-5201-4553-a9de-c08a7a9c9f52/console/0.log" Mar 11 09:28:42 crc 
kubenswrapper[4830]: I0311 09:28:42.599635 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.688352 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-oauth-serving-cert\") pod \"cd450036-5201-4553-a9de-c08a7a9c9f52\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.688413 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-trusted-ca-bundle\") pod \"cd450036-5201-4553-a9de-c08a7a9c9f52\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.688484 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-service-ca\") pod \"cd450036-5201-4553-a9de-c08a7a9c9f52\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.688520 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-oauth-config\") pod \"cd450036-5201-4553-a9de-c08a7a9c9f52\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.688556 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kc9g\" (UniqueName: \"kubernetes.io/projected/cd450036-5201-4553-a9de-c08a7a9c9f52-kube-api-access-4kc9g\") pod \"cd450036-5201-4553-a9de-c08a7a9c9f52\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " Mar 11 09:28:42 crc 
kubenswrapper[4830]: I0311 09:28:42.688625 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-console-config\") pod \"cd450036-5201-4553-a9de-c08a7a9c9f52\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.688646 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-serving-cert\") pod \"cd450036-5201-4553-a9de-c08a7a9c9f52\" (UID: \"cd450036-5201-4553-a9de-c08a7a9c9f52\") " Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.689529 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cd450036-5201-4553-a9de-c08a7a9c9f52" (UID: "cd450036-5201-4553-a9de-c08a7a9c9f52"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.689586 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-console-config" (OuterVolumeSpecName: "console-config") pod "cd450036-5201-4553-a9de-c08a7a9c9f52" (UID: "cd450036-5201-4553-a9de-c08a7a9c9f52"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.689632 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-service-ca" (OuterVolumeSpecName: "service-ca") pod "cd450036-5201-4553-a9de-c08a7a9c9f52" (UID: "cd450036-5201-4553-a9de-c08a7a9c9f52"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.689749 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cd450036-5201-4553-a9de-c08a7a9c9f52" (UID: "cd450036-5201-4553-a9de-c08a7a9c9f52"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.695211 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cd450036-5201-4553-a9de-c08a7a9c9f52" (UID: "cd450036-5201-4553-a9de-c08a7a9c9f52"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.695665 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cd450036-5201-4553-a9de-c08a7a9c9f52" (UID: "cd450036-5201-4553-a9de-c08a7a9c9f52"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.695698 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd450036-5201-4553-a9de-c08a7a9c9f52-kube-api-access-4kc9g" (OuterVolumeSpecName: "kube-api-access-4kc9g") pod "cd450036-5201-4553-a9de-c08a7a9c9f52" (UID: "cd450036-5201-4553-a9de-c08a7a9c9f52"). InnerVolumeSpecName "kube-api-access-4kc9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.790458 4830 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.790529 4830 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.790554 4830 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-service-ca\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.790576 4830 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.790599 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kc9g\" (UniqueName: \"kubernetes.io/projected/cd450036-5201-4553-a9de-c08a7a9c9f52-kube-api-access-4kc9g\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.790673 4830 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd450036-5201-4553-a9de-c08a7a9c9f52-console-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.790697 4830 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd450036-5201-4553-a9de-c08a7a9c9f52-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:42 crc 
kubenswrapper[4830]: I0311 09:28:42.830908 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c85g7"] Mar 11 09:28:42 crc kubenswrapper[4830]: E0311 09:28:42.831299 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd450036-5201-4553-a9de-c08a7a9c9f52" containerName="console" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.831329 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd450036-5201-4553-a9de-c08a7a9c9f52" containerName="console" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.831504 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd450036-5201-4553-a9de-c08a7a9c9f52" containerName="console" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.835348 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.841283 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c85g7"] Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.993092 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbjgs\" (UniqueName: \"kubernetes.io/projected/3248986c-2bdb-4095-87d0-07eaf6acd7b1-kube-api-access-sbjgs\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.993184 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3248986c-2bdb-4095-87d0-07eaf6acd7b1-utilities\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:42 crc kubenswrapper[4830]: I0311 09:28:42.993394 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3248986c-2bdb-4095-87d0-07eaf6acd7b1-catalog-content\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.095251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3248986c-2bdb-4095-87d0-07eaf6acd7b1-utilities\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.095536 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3248986c-2bdb-4095-87d0-07eaf6acd7b1-catalog-content\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.095762 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbjgs\" (UniqueName: \"kubernetes.io/projected/3248986c-2bdb-4095-87d0-07eaf6acd7b1-kube-api-access-sbjgs\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.095813 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3248986c-2bdb-4095-87d0-07eaf6acd7b1-utilities\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.096168 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3248986c-2bdb-4095-87d0-07eaf6acd7b1-catalog-content\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.115125 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbjgs\" (UniqueName: \"kubernetes.io/projected/3248986c-2bdb-4095-87d0-07eaf6acd7b1-kube-api-access-sbjgs\") pod \"redhat-operators-c85g7\" (UID: \"3248986c-2bdb-4095-87d0-07eaf6acd7b1\") " pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.186790 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.277078 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nzf" event={"ID":"21f43104-3861-4b6c-95cf-c99e43f4f556","Type":"ContainerStarted","Data":"4404acecbb2ea282c1985d51fd3428c061067437c32ec30c928ab2168f411705"} Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.282812 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-75j46_cd450036-5201-4553-a9de-c08a7a9c9f52/console/0.log" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.282859 4830 generic.go:334] "Generic (PLEG): container finished" podID="cd450036-5201-4553-a9de-c08a7a9c9f52" containerID="39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83" exitCode=2 Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.282946 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-75j46" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.283997 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-75j46" event={"ID":"cd450036-5201-4553-a9de-c08a7a9c9f52","Type":"ContainerDied","Data":"39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83"} Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.284058 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-75j46" event={"ID":"cd450036-5201-4553-a9de-c08a7a9c9f52","Type":"ContainerDied","Data":"4155b31ee39b8de18e3327f73d3c10bca7896fd617502a3b86b7df2b5899ff62"} Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.284081 4830 scope.go:117] "RemoveContainer" containerID="39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.286223 4830 generic.go:334] "Generic (PLEG): container finished" podID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerID="14dbcb8bd312984f7da4762dfd60f2aacd50dabf0cb2f4ed142a40d9b404d64e" exitCode=0 Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.286264 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" event={"ID":"c7147327-7c7b-4a1d-94c7-c684ec5337a0","Type":"ContainerDied","Data":"14dbcb8bd312984f7da4762dfd60f2aacd50dabf0cb2f4ed142a40d9b404d64e"} Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.296366 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b5nzf" podStartSLOduration=2.774464912 podStartE2EDuration="5.296344777s" podCreationTimestamp="2026-03-11 09:28:38 +0000 UTC" firstStartedPulling="2026-03-11 09:28:40.238202053 +0000 UTC m=+888.019352782" lastFinishedPulling="2026-03-11 09:28:42.760081918 +0000 UTC m=+890.541232647" observedRunningTime="2026-03-11 
09:28:43.294094493 +0000 UTC m=+891.075245202" watchObservedRunningTime="2026-03-11 09:28:43.296344777 +0000 UTC m=+891.077495466" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.308754 4830 scope.go:117] "RemoveContainer" containerID="39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83" Mar 11 09:28:43 crc kubenswrapper[4830]: E0311 09:28:43.309263 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83\": container with ID starting with 39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83 not found: ID does not exist" containerID="39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.309312 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83"} err="failed to get container status \"39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83\": rpc error: code = NotFound desc = could not find container \"39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83\": container with ID starting with 39ea5b58f0f70e3c870b016cd13728b17892edebe43857e73334f17b1ca04c83 not found: ID does not exist" Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.372467 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-75j46"] Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.377539 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-75j46"] Mar 11 09:28:43 crc kubenswrapper[4830]: I0311 09:28:43.633824 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c85g7"] Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.295052 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-c85g7" event={"ID":"3248986c-2bdb-4095-87d0-07eaf6acd7b1","Type":"ContainerStarted","Data":"d1b9f706ee5af63fc7338d64c6044f2820c8202aca40ab2d9a70eb353e55ca74"} Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.527474 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.625896 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbwdk\" (UniqueName: \"kubernetes.io/projected/c7147327-7c7b-4a1d-94c7-c684ec5337a0-kube-api-access-xbwdk\") pod \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.626045 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-util\") pod \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.626078 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-bundle\") pod \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\" (UID: \"c7147327-7c7b-4a1d-94c7-c684ec5337a0\") " Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.627065 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-bundle" (OuterVolumeSpecName: "bundle") pod "c7147327-7c7b-4a1d-94c7-c684ec5337a0" (UID: "c7147327-7c7b-4a1d-94c7-c684ec5337a0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.633812 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7147327-7c7b-4a1d-94c7-c684ec5337a0-kube-api-access-xbwdk" (OuterVolumeSpecName: "kube-api-access-xbwdk") pod "c7147327-7c7b-4a1d-94c7-c684ec5337a0" (UID: "c7147327-7c7b-4a1d-94c7-c684ec5337a0"). InnerVolumeSpecName "kube-api-access-xbwdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.640493 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-util" (OuterVolumeSpecName: "util") pod "c7147327-7c7b-4a1d-94c7-c684ec5337a0" (UID: "c7147327-7c7b-4a1d-94c7-c684ec5337a0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.727895 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbwdk\" (UniqueName: \"kubernetes.io/projected/c7147327-7c7b-4a1d-94c7-c684ec5337a0-kube-api-access-xbwdk\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.727936 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-util\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.727949 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7147327-7c7b-4a1d-94c7-c684ec5337a0-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:44 crc kubenswrapper[4830]: I0311 09:28:44.939999 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd450036-5201-4553-a9de-c08a7a9c9f52" path="/var/lib/kubelet/pods/cd450036-5201-4553-a9de-c08a7a9c9f52/volumes" Mar 11 09:28:45 crc 
kubenswrapper[4830]: I0311 09:28:45.304314 4830 generic.go:334] "Generic (PLEG): container finished" podID="3248986c-2bdb-4095-87d0-07eaf6acd7b1" containerID="9aac0752bbf1309198ec8cac877885e655767c8ecff3a554f93506c46762cbfe" exitCode=0 Mar 11 09:28:45 crc kubenswrapper[4830]: I0311 09:28:45.304461 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c85g7" event={"ID":"3248986c-2bdb-4095-87d0-07eaf6acd7b1","Type":"ContainerDied","Data":"9aac0752bbf1309198ec8cac877885e655767c8ecff3a554f93506c46762cbfe"} Mar 11 09:28:45 crc kubenswrapper[4830]: I0311 09:28:45.307553 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" event={"ID":"c7147327-7c7b-4a1d-94c7-c684ec5337a0","Type":"ContainerDied","Data":"0a6e0e7af1a63c5d00eb0d7bd852b86ce53645453a8d5428bc25554bbce81597"} Mar 11 09:28:45 crc kubenswrapper[4830]: I0311 09:28:45.307594 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a6e0e7af1a63c5d00eb0d7bd852b86ce53645453a8d5428bc25554bbce81597" Mar 11 09:28:45 crc kubenswrapper[4830]: I0311 09:28:45.307667 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql" Mar 11 09:28:48 crc kubenswrapper[4830]: I0311 09:28:48.578658 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:48 crc kubenswrapper[4830]: I0311 09:28:48.579010 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:48 crc kubenswrapper[4830]: I0311 09:28:48.623095 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:49 crc kubenswrapper[4830]: I0311 09:28:49.378531 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.225656 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jmtgc"] Mar 11 09:28:50 crc kubenswrapper[4830]: E0311 09:28:50.226757 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerName="util" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.226777 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerName="util" Mar 11 09:28:50 crc kubenswrapper[4830]: E0311 09:28:50.226797 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerName="pull" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.226807 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerName="pull" Mar 11 09:28:50 crc kubenswrapper[4830]: E0311 09:28:50.226852 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerName="extract" Mar 
11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.226861 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerName="extract" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.227002 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7147327-7c7b-4a1d-94c7-c684ec5337a0" containerName="extract" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.228111 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.233685 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmtgc"] Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.421961 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-utilities\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.422254 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pmnn\" (UniqueName: \"kubernetes.io/projected/45e1ee42-ff33-4d33-b167-50d8bc347d3e-kube-api-access-9pmnn\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.422370 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-catalog-content\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 
crc kubenswrapper[4830]: I0311 09:28:50.523392 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-utilities\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.523505 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pmnn\" (UniqueName: \"kubernetes.io/projected/45e1ee42-ff33-4d33-b167-50d8bc347d3e-kube-api-access-9pmnn\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.523534 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-catalog-content\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.523949 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-utilities\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.523989 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-catalog-content\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.556072 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pmnn\" (UniqueName: \"kubernetes.io/projected/45e1ee42-ff33-4d33-b167-50d8bc347d3e-kube-api-access-9pmnn\") pod \"redhat-marketplace-jmtgc\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:50 crc kubenswrapper[4830]: I0311 09:28:50.854274 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:28:52 crc kubenswrapper[4830]: I0311 09:28:52.230513 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5nzf"] Mar 11 09:28:52 crc kubenswrapper[4830]: I0311 09:28:52.231078 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b5nzf" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerName="registry-server" containerID="cri-o://4404acecbb2ea282c1985d51fd3428c061067437c32ec30c928ab2168f411705" gracePeriod=2 Mar 11 09:28:53 crc kubenswrapper[4830]: I0311 09:28:53.358095 4830 generic.go:334] "Generic (PLEG): container finished" podID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerID="4404acecbb2ea282c1985d51fd3428c061067437c32ec30c928ab2168f411705" exitCode=0 Mar 11 09:28:53 crc kubenswrapper[4830]: I0311 09:28:53.358138 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nzf" event={"ID":"21f43104-3861-4b6c-95cf-c99e43f4f556","Type":"ContainerDied","Data":"4404acecbb2ea282c1985d51fd3428c061067437c32ec30c928ab2168f411705"} Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.052906 4830 scope.go:117] "RemoveContainer" containerID="35010b390229211ea793b8622ff91560faff48c170c0b0052342b1f8ee3ad633" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.823621 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj"] Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.824254 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.836422 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.836796 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.837006 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xmtgs" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.837161 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.837301 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.858297 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj"] Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.993134 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m798p\" (UniqueName: \"kubernetes.io/projected/e3733633-d23b-4ef9-90cf-89614677589d-kube-api-access-m798p\") pod \"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.993213 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3733633-d23b-4ef9-90cf-89614677589d-webhook-cert\") pod \"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:54 crc kubenswrapper[4830]: I0311 09:28:54.993275 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3733633-d23b-4ef9-90cf-89614677589d-apiservice-cert\") pod \"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.060674 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r"] Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.061637 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.066489 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-hghvs" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.066696 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.066738 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.076093 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r"] Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.094322 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3733633-d23b-4ef9-90cf-89614677589d-webhook-cert\") pod \"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.094387 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3733633-d23b-4ef9-90cf-89614677589d-apiservice-cert\") pod \"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.094417 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m798p\" (UniqueName: \"kubernetes.io/projected/e3733633-d23b-4ef9-90cf-89614677589d-kube-api-access-m798p\") pod 
\"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.101248 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3733633-d23b-4ef9-90cf-89614677589d-webhook-cert\") pod \"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.102603 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3733633-d23b-4ef9-90cf-89614677589d-apiservice-cert\") pod \"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.110603 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m798p\" (UniqueName: \"kubernetes.io/projected/e3733633-d23b-4ef9-90cf-89614677589d-kube-api-access-m798p\") pod \"metallb-operator-controller-manager-85c56b6668-mc4fj\" (UID: \"e3733633-d23b-4ef9-90cf-89614677589d\") " pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.141354 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.195637 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/847b6273-e498-4025-a834-41173cfce564-webhook-cert\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.195693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h297\" (UniqueName: \"kubernetes.io/projected/847b6273-e498-4025-a834-41173cfce564-kube-api-access-8h297\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.195752 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/847b6273-e498-4025-a834-41173cfce564-apiservice-cert\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.297365 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/847b6273-e498-4025-a834-41173cfce564-webhook-cert\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.297417 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8h297\" (UniqueName: \"kubernetes.io/projected/847b6273-e498-4025-a834-41173cfce564-kube-api-access-8h297\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.297452 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/847b6273-e498-4025-a834-41173cfce564-apiservice-cert\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.302065 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/847b6273-e498-4025-a834-41173cfce564-apiservice-cert\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.302530 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/847b6273-e498-4025-a834-41173cfce564-webhook-cert\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.318796 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h297\" (UniqueName: \"kubernetes.io/projected/847b6273-e498-4025-a834-41173cfce564-kube-api-access-8h297\") pod \"metallb-operator-webhook-server-d9865c9bc-zgs5r\" (UID: \"847b6273-e498-4025-a834-41173cfce564\") " 
pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:55 crc kubenswrapper[4830]: I0311 09:28:55.381461 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.275973 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.314576 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmtgc"] Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.377331 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmtgc" event={"ID":"45e1ee42-ff33-4d33-b167-50d8bc347d3e","Type":"ContainerStarted","Data":"f1dd482e4911561d6bd5c704a7d269cebfa1a1390bbb7f7b08e9ac1898e0694b"} Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.382391 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nzf" event={"ID":"21f43104-3861-4b6c-95cf-c99e43f4f556","Type":"ContainerDied","Data":"000c91f1c932637670263f7f8e38ab0ddf90457034770380ba2a42e907db5f2a"} Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.382455 4830 scope.go:117] "RemoveContainer" containerID="4404acecbb2ea282c1985d51fd3428c061067437c32ec30c928ab2168f411705" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.382522 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5nzf" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.404193 4830 scope.go:117] "RemoveContainer" containerID="8339eb8aca6d99d2c80f67578ba4f3bc6991f249e94b564c5ccebb74e88e9d1b" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.412144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-catalog-content\") pod \"21f43104-3861-4b6c-95cf-c99e43f4f556\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.412174 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4sgx\" (UniqueName: \"kubernetes.io/projected/21f43104-3861-4b6c-95cf-c99e43f4f556-kube-api-access-d4sgx\") pod \"21f43104-3861-4b6c-95cf-c99e43f4f556\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.412220 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-utilities\") pod \"21f43104-3861-4b6c-95cf-c99e43f4f556\" (UID: \"21f43104-3861-4b6c-95cf-c99e43f4f556\") " Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.413392 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-utilities" (OuterVolumeSpecName: "utilities") pod "21f43104-3861-4b6c-95cf-c99e43f4f556" (UID: "21f43104-3861-4b6c-95cf-c99e43f4f556"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.428681 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f43104-3861-4b6c-95cf-c99e43f4f556-kube-api-access-d4sgx" (OuterVolumeSpecName: "kube-api-access-d4sgx") pod "21f43104-3861-4b6c-95cf-c99e43f4f556" (UID: "21f43104-3861-4b6c-95cf-c99e43f4f556"). InnerVolumeSpecName "kube-api-access-d4sgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.444911 4830 scope.go:117] "RemoveContainer" containerID="6ddb37a56bae4c1a20a27d94e64d69da44852df1c3e57e1f6198a82c31b9bad5" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.504429 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21f43104-3861-4b6c-95cf-c99e43f4f556" (UID: "21f43104-3861-4b6c-95cf-c99e43f4f556"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.516892 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.516931 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4sgx\" (UniqueName: \"kubernetes.io/projected/21f43104-3861-4b6c-95cf-c99e43f4f556-kube-api-access-d4sgx\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.516942 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f43104-3861-4b6c-95cf-c99e43f4f556-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.603925 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r"] Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.608076 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj"] Mar 11 09:28:56 crc kubenswrapper[4830]: W0311 09:28:56.616303 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3733633_d23b_4ef9_90cf_89614677589d.slice/crio-79acbe0bac88e467eb506dcffc8d9427cd2d6d71002d13779dab7389be3a9322 WatchSource:0}: Error finding container 79acbe0bac88e467eb506dcffc8d9427cd2d6d71002d13779dab7389be3a9322: Status 404 returned error can't find the container with id 79acbe0bac88e467eb506dcffc8d9427cd2d6d71002d13779dab7389be3a9322 Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.713508 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5nzf"] Mar 11 09:28:56 crc kubenswrapper[4830]: 
I0311 09:28:56.716715 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b5nzf"] Mar 11 09:28:56 crc kubenswrapper[4830]: I0311 09:28:56.938147 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" path="/var/lib/kubelet/pods/21f43104-3861-4b6c-95cf-c99e43f4f556/volumes" Mar 11 09:28:57 crc kubenswrapper[4830]: I0311 09:28:57.389398 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" event={"ID":"847b6273-e498-4025-a834-41173cfce564","Type":"ContainerStarted","Data":"99344a38fd89b9f24ec1adf6eb5a3751ed3a9ed5402cd8f1ca63d0493982fbb6"} Mar 11 09:28:57 crc kubenswrapper[4830]: I0311 09:28:57.390439 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" event={"ID":"e3733633-d23b-4ef9-90cf-89614677589d","Type":"ContainerStarted","Data":"79acbe0bac88e467eb506dcffc8d9427cd2d6d71002d13779dab7389be3a9322"} Mar 11 09:28:57 crc kubenswrapper[4830]: I0311 09:28:57.392303 4830 generic.go:334] "Generic (PLEG): container finished" podID="3248986c-2bdb-4095-87d0-07eaf6acd7b1" containerID="da7ec92b7ec89646cdaa7e31ea4984fef60323550ce3bf3267c4a9db2d5f8c68" exitCode=0 Mar 11 09:28:57 crc kubenswrapper[4830]: I0311 09:28:57.392376 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c85g7" event={"ID":"3248986c-2bdb-4095-87d0-07eaf6acd7b1","Type":"ContainerDied","Data":"da7ec92b7ec89646cdaa7e31ea4984fef60323550ce3bf3267c4a9db2d5f8c68"} Mar 11 09:28:57 crc kubenswrapper[4830]: I0311 09:28:57.394742 4830 generic.go:334] "Generic (PLEG): container finished" podID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerID="9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe" exitCode=0 Mar 11 09:28:57 crc kubenswrapper[4830]: I0311 09:28:57.394771 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-jmtgc" event={"ID":"45e1ee42-ff33-4d33-b167-50d8bc347d3e","Type":"ContainerDied","Data":"9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe"} Mar 11 09:28:58 crc kubenswrapper[4830]: I0311 09:28:58.408536 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c85g7" event={"ID":"3248986c-2bdb-4095-87d0-07eaf6acd7b1","Type":"ContainerStarted","Data":"d5ea4d146295bb9f2bdb95773f1ee24a94a591fee6d008dace67c0f17be5f5c8"} Mar 11 09:28:58 crc kubenswrapper[4830]: I0311 09:28:58.433058 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c85g7" podStartSLOduration=3.967390628 podStartE2EDuration="16.433037123s" podCreationTimestamp="2026-03-11 09:28:42 +0000 UTC" firstStartedPulling="2026-03-11 09:28:45.308175288 +0000 UTC m=+893.089325977" lastFinishedPulling="2026-03-11 09:28:57.773821783 +0000 UTC m=+905.554972472" observedRunningTime="2026-03-11 09:28:58.428111485 +0000 UTC m=+906.209262194" watchObservedRunningTime="2026-03-11 09:28:58.433037123 +0000 UTC m=+906.214187832" Mar 11 09:28:59 crc kubenswrapper[4830]: I0311 09:28:59.415440 4830 generic.go:334] "Generic (PLEG): container finished" podID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerID="3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b" exitCode=0 Mar 11 09:28:59 crc kubenswrapper[4830]: I0311 09:28:59.415531 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmtgc" event={"ID":"45e1ee42-ff33-4d33-b167-50d8bc347d3e","Type":"ContainerDied","Data":"3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b"} Mar 11 09:29:03 crc kubenswrapper[4830]: I0311 09:29:03.187667 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:29:03 crc kubenswrapper[4830]: I0311 09:29:03.188044 
4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:29:04 crc kubenswrapper[4830]: I0311 09:29:04.228685 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c85g7" podUID="3248986c-2bdb-4095-87d0-07eaf6acd7b1" containerName="registry-server" probeResult="failure" output=< Mar 11 09:29:04 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 09:29:04 crc kubenswrapper[4830]: > Mar 11 09:29:08 crc kubenswrapper[4830]: I0311 09:29:08.479881 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" event={"ID":"e3733633-d23b-4ef9-90cf-89614677589d","Type":"ContainerStarted","Data":"e3c5e4e26567c99e3d7eb27935d75a603e8c250b635e831c989a058bab20ab0a"} Mar 11 09:29:08 crc kubenswrapper[4830]: I0311 09:29:08.480210 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:29:08 crc kubenswrapper[4830]: I0311 09:29:08.482148 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmtgc" event={"ID":"45e1ee42-ff33-4d33-b167-50d8bc347d3e","Type":"ContainerStarted","Data":"1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29"} Mar 11 09:29:08 crc kubenswrapper[4830]: I0311 09:29:08.483700 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" event={"ID":"847b6273-e498-4025-a834-41173cfce564","Type":"ContainerStarted","Data":"d9700dad4dfb21634da28c852acec0a4d072b4c3545557cb5cd47b4b9b67624d"} Mar 11 09:29:08 crc kubenswrapper[4830]: I0311 09:29:08.483846 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:29:08 crc kubenswrapper[4830]: I0311 
09:29:08.503447 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" podStartSLOduration=3.36518438 podStartE2EDuration="14.503422333s" podCreationTimestamp="2026-03-11 09:28:54 +0000 UTC" firstStartedPulling="2026-03-11 09:28:56.619672404 +0000 UTC m=+904.400823093" lastFinishedPulling="2026-03-11 09:29:07.757910347 +0000 UTC m=+915.539061046" observedRunningTime="2026-03-11 09:29:08.498038482 +0000 UTC m=+916.279189191" watchObservedRunningTime="2026-03-11 09:29:08.503422333 +0000 UTC m=+916.284573042" Mar 11 09:29:08 crc kubenswrapper[4830]: I0311 09:29:08.535845 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jmtgc" podStartSLOduration=8.174459925 podStartE2EDuration="18.535825374s" podCreationTimestamp="2026-03-11 09:28:50 +0000 UTC" firstStartedPulling="2026-03-11 09:28:57.395799298 +0000 UTC m=+905.176949987" lastFinishedPulling="2026-03-11 09:29:07.757164747 +0000 UTC m=+915.538315436" observedRunningTime="2026-03-11 09:29:08.533437308 +0000 UTC m=+916.314588037" watchObservedRunningTime="2026-03-11 09:29:08.535825374 +0000 UTC m=+916.316976063" Mar 11 09:29:08 crc kubenswrapper[4830]: I0311 09:29:08.551597 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" podStartSLOduration=2.394616869 podStartE2EDuration="13.551577783s" podCreationTimestamp="2026-03-11 09:28:55 +0000 UTC" firstStartedPulling="2026-03-11 09:28:56.616628589 +0000 UTC m=+904.397779268" lastFinishedPulling="2026-03-11 09:29:07.773589493 +0000 UTC m=+915.554740182" observedRunningTime="2026-03-11 09:29:08.550116072 +0000 UTC m=+916.331266761" watchObservedRunningTime="2026-03-11 09:29:08.551577783 +0000 UTC m=+916.332728472" Mar 11 09:29:10 crc kubenswrapper[4830]: I0311 09:29:10.854776 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:29:10 crc kubenswrapper[4830]: I0311 09:29:10.855674 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:29:10 crc kubenswrapper[4830]: I0311 09:29:10.897186 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:29:12 crc kubenswrapper[4830]: I0311 09:29:12.553397 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:29:13 crc kubenswrapper[4830]: I0311 09:29:13.269062 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:29:14 crc kubenswrapper[4830]: I0311 09:29:14.969966 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c85g7" Mar 11 09:29:15 crc kubenswrapper[4830]: I0311 09:29:15.006897 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmtgc"] Mar 11 09:29:15 crc kubenswrapper[4830]: I0311 09:29:15.865061 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c85g7"] Mar 11 09:29:15 crc kubenswrapper[4830]: I0311 09:29:15.976972 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jmtgc" podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerName="registry-server" containerID="cri-o://1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29" gracePeriod=2 Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.418294 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qn82n"] Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.418823 4830 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qn82n" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="registry-server" containerID="cri-o://ef8f038907985318cf60fb400b083a35a9e9dd64392811c9742322d6bf82d113" gracePeriod=2 Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.624938 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.674048 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pmnn\" (UniqueName: \"kubernetes.io/projected/45e1ee42-ff33-4d33-b167-50d8bc347d3e-kube-api-access-9pmnn\") pod \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.674483 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-catalog-content\") pod \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.674657 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-utilities\") pod \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\" (UID: \"45e1ee42-ff33-4d33-b167-50d8bc347d3e\") " Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.675486 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-utilities" (OuterVolumeSpecName: "utilities") pod "45e1ee42-ff33-4d33-b167-50d8bc347d3e" (UID: "45e1ee42-ff33-4d33-b167-50d8bc347d3e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.695272 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e1ee42-ff33-4d33-b167-50d8bc347d3e-kube-api-access-9pmnn" (OuterVolumeSpecName: "kube-api-access-9pmnn") pod "45e1ee42-ff33-4d33-b167-50d8bc347d3e" (UID: "45e1ee42-ff33-4d33-b167-50d8bc347d3e"). InnerVolumeSpecName "kube-api-access-9pmnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.699492 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45e1ee42-ff33-4d33-b167-50d8bc347d3e" (UID: "45e1ee42-ff33-4d33-b167-50d8bc347d3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.776195 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pmnn\" (UniqueName: \"kubernetes.io/projected/45e1ee42-ff33-4d33-b167-50d8bc347d3e-kube-api-access-9pmnn\") on node \"crc\" DevicePath \"\"" Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.776514 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.776529 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45e1ee42-ff33-4d33-b167-50d8bc347d3e-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.985611 4830 generic.go:334] "Generic (PLEG): container finished" podID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" 
containerID="1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29" exitCode=0 Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.985681 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmtgc" event={"ID":"45e1ee42-ff33-4d33-b167-50d8bc347d3e","Type":"ContainerDied","Data":"1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29"} Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.985710 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jmtgc" event={"ID":"45e1ee42-ff33-4d33-b167-50d8bc347d3e","Type":"ContainerDied","Data":"f1dd482e4911561d6bd5c704a7d269cebfa1a1390bbb7f7b08e9ac1898e0694b"} Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.985731 4830 scope.go:117] "RemoveContainer" containerID="1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29" Mar 11 09:29:16 crc kubenswrapper[4830]: I0311 09:29:16.985869 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jmtgc" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.003086 4830 generic.go:334] "Generic (PLEG): container finished" podID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerID="ef8f038907985318cf60fb400b083a35a9e9dd64392811c9742322d6bf82d113" exitCode=0 Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.003123 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn82n" event={"ID":"86ab19b6-db7c-4c64-a5cb-cc60d48e1570","Type":"ContainerDied","Data":"ef8f038907985318cf60fb400b083a35a9e9dd64392811c9742322d6bf82d113"} Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.009525 4830 scope.go:117] "RemoveContainer" containerID="3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.016361 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmtgc"] Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.021323 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jmtgc"] Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.031682 4830 scope.go:117] "RemoveContainer" containerID="9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.049040 4830 scope.go:117] "RemoveContainer" containerID="1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29" Mar 11 09:29:17 crc kubenswrapper[4830]: E0311 09:29:17.049684 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29\": container with ID starting with 1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29 not found: ID does not exist" containerID="1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29" 
Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.049789 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29"} err="failed to get container status \"1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29\": rpc error: code = NotFound desc = could not find container \"1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29\": container with ID starting with 1b64e2bdddccfd45bc1836b19cb9d7ecc36f353fcdaad13a4843f7ccb69e4d29 not found: ID does not exist" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.049871 4830 scope.go:117] "RemoveContainer" containerID="3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b" Mar 11 09:29:17 crc kubenswrapper[4830]: E0311 09:29:17.050328 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b\": container with ID starting with 3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b not found: ID does not exist" containerID="3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.050427 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b"} err="failed to get container status \"3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b\": rpc error: code = NotFound desc = could not find container \"3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b\": container with ID starting with 3f558c19ce33a908759e650dabab95873429a638146f64f5a762f974909deb2b not found: ID does not exist" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.050515 4830 scope.go:117] "RemoveContainer" 
containerID="9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe" Mar 11 09:29:17 crc kubenswrapper[4830]: E0311 09:29:17.050953 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe\": container with ID starting with 9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe not found: ID does not exist" containerID="9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.051055 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe"} err="failed to get container status \"9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe\": rpc error: code = NotFound desc = could not find container \"9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe\": container with ID starting with 9e0bf7027c9a709f4ee5b8e1f38e2495319dbd1731d90f3f691ad723ce96b0fe not found: ID does not exist" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.310898 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.484851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-utilities\") pod \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.484927 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdg92\" (UniqueName: \"kubernetes.io/projected/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-kube-api-access-zdg92\") pod \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.484988 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-catalog-content\") pod \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\" (UID: \"86ab19b6-db7c-4c64-a5cb-cc60d48e1570\") " Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.485762 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-utilities" (OuterVolumeSpecName: "utilities") pod "86ab19b6-db7c-4c64-a5cb-cc60d48e1570" (UID: "86ab19b6-db7c-4c64-a5cb-cc60d48e1570"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.492204 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-kube-api-access-zdg92" (OuterVolumeSpecName: "kube-api-access-zdg92") pod "86ab19b6-db7c-4c64-a5cb-cc60d48e1570" (UID: "86ab19b6-db7c-4c64-a5cb-cc60d48e1570"). InnerVolumeSpecName "kube-api-access-zdg92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.586338 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.586381 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdg92\" (UniqueName: \"kubernetes.io/projected/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-kube-api-access-zdg92\") on node \"crc\" DevicePath \"\"" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.621836 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86ab19b6-db7c-4c64-a5cb-cc60d48e1570" (UID: "86ab19b6-db7c-4c64-a5cb-cc60d48e1570"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:29:17 crc kubenswrapper[4830]: I0311 09:29:17.687654 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ab19b6-db7c-4c64-a5cb-cc60d48e1570-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.013251 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qn82n" event={"ID":"86ab19b6-db7c-4c64-a5cb-cc60d48e1570","Type":"ContainerDied","Data":"65138cbca454c2616a8b8fe79277c3bf3cb5d2180c60a48fb07b66f366065d8f"} Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.013296 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qn82n" Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.013305 4830 scope.go:117] "RemoveContainer" containerID="ef8f038907985318cf60fb400b083a35a9e9dd64392811c9742322d6bf82d113" Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.033345 4830 scope.go:117] "RemoveContainer" containerID="ce4bac958f32797ea6fe525f84e94b8409440d09c97b9a0588f6dd265a6a8fee" Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.045800 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qn82n"] Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.050141 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qn82n"] Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.082399 4830 scope.go:117] "RemoveContainer" containerID="5dd45e289ff48d0e5b02f1dfa232b824736d9a061e8183e74916274a005eb00f" Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.939512 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" path="/var/lib/kubelet/pods/45e1ee42-ff33-4d33-b167-50d8bc347d3e/volumes" Mar 11 09:29:18 crc kubenswrapper[4830]: I0311 09:29:18.940514 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" path="/var/lib/kubelet/pods/86ab19b6-db7c-4c64-a5cb-cc60d48e1570/volumes" Mar 11 09:29:25 crc kubenswrapper[4830]: I0311 09:29:25.389324 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-d9865c9bc-zgs5r" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.143628 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-85c56b6668-mc4fj" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.837697 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2"] Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.837954 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.837966 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.838004 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="extract-utilities" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838012 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="extract-utilities" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.838042 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerName="extract-content" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838050 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerName="extract-content" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.838060 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838066 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.838076 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerName="extract-utilities" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838082 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerName="extract-utilities" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.838090 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerName="extract-utilities" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838095 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerName="extract-utilities" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.838104 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="extract-content" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838111 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="extract-content" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.838120 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerName="extract-content" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838126 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerName="extract-content" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.838138 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838144 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838236 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ab19b6-db7c-4c64-a5cb-cc60d48e1570" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838249 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="45e1ee42-ff33-4d33-b167-50d8bc347d3e" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838260 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f43104-3861-4b6c-95cf-c99e43f4f556" containerName="registry-server" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.838668 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.840701 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7vbxt" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.842647 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.845511 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-mvhbg"] Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.849525 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.853578 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2"] Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.854955 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8f7\" (UniqueName: \"kubernetes.io/projected/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-kube-api-access-xt8f7\") pod \"frr-k8s-webhook-server-bcc4b6f68-svpm2\" (UID: \"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.855005 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-svpm2\" (UID: \"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.858900 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.859100 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.900874 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gmrbg"] Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.901774 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-gmrbg" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.904980 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.905036 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.905262 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wcggv" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.905803 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.927979 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-qg75l"] Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.928907 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.930552 4830 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.941299 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qg75l"] Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.956410 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8f7\" (UniqueName: \"kubernetes.io/projected/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-kube-api-access-xt8f7\") pod \"frr-k8s-webhook-server-bcc4b6f68-svpm2\" (UID: \"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.956453 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-svpm2\" (UID: \"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.956479 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-frr-conf\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.956500 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-metrics\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:45 crc 
kubenswrapper[4830]: I0311 09:29:45.956518 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-frr-sockets\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.956558 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969c667a-e499-4c6a-9da3-d7813886c794-metrics-certs\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.956577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/969c667a-e499-4c6a-9da3-d7813886c794-frr-startup\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.956597 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2d8\" (UniqueName: \"kubernetes.io/projected/969c667a-e499-4c6a-9da3-d7813886c794-kube-api-access-jh2d8\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.956636 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-reloader\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.957603 4830 secret.go:188] Couldn't get secret 
metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 11 09:29:45 crc kubenswrapper[4830]: E0311 09:29:45.957656 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-cert podName:bd1c2f0c-c126-4cd9-863d-6ec94f3920ba nodeName:}" failed. No retries permitted until 2026-03-11 09:29:46.45763764 +0000 UTC m=+954.238788329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-cert") pod "frr-k8s-webhook-server-bcc4b6f68-svpm2" (UID: "bd1c2f0c-c126-4cd9-863d-6ec94f3920ba") : secret "frr-k8s-webhook-server-cert" not found Mar 11 09:29:45 crc kubenswrapper[4830]: I0311 09:29:45.992689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8f7\" (UniqueName: \"kubernetes.io/projected/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-kube-api-access-xt8f7\") pod \"frr-k8s-webhook-server-bcc4b6f68-svpm2\" (UID: \"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.057708 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-cert\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058059 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8rcw\" (UniqueName: \"kubernetes.io/projected/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-kube-api-access-f8rcw\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc 
kubenswrapper[4830]: I0311 09:29:46.058106 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-metrics-certs\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058130 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969c667a-e499-4c6a-9da3-d7813886c794-metrics-certs\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058149 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/969c667a-e499-4c6a-9da3-d7813886c794-frr-startup\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058167 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2d8\" (UniqueName: \"kubernetes.io/projected/969c667a-e499-4c6a-9da3-d7813886c794-kube-api-access-jh2d8\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058191 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4fb\" (UniqueName: \"kubernetes.io/projected/08cd58a8-ee9e-44a8-874f-2187733e6d57-kube-api-access-dq4fb\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058215 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"reloader\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-reloader\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058238 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058262 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/08cd58a8-ee9e-44a8-874f-2187733e6d57-metallb-excludel2\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058278 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-metrics-certs\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058311 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-frr-conf\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058332 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-metrics\") pod \"frr-k8s-mvhbg\" (UID: 
\"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058353 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-frr-sockets\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058711 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-frr-conf\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058804 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-frr-sockets\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058895 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-reloader\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.058912 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/969c667a-e499-4c6a-9da3-d7813886c794-metrics\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.059373 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/969c667a-e499-4c6a-9da3-d7813886c794-frr-startup\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.061116 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/969c667a-e499-4c6a-9da3-d7813886c794-metrics-certs\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.072504 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2d8\" (UniqueName: \"kubernetes.io/projected/969c667a-e499-4c6a-9da3-d7813886c794-kube-api-access-jh2d8\") pod \"frr-k8s-mvhbg\" (UID: \"969c667a-e499-4c6a-9da3-d7813886c794\") " pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.159624 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-cert\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.159675 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8rcw\" (UniqueName: \"kubernetes.io/projected/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-kube-api-access-f8rcw\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.159709 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-metrics-certs\") pod \"speaker-gmrbg\" (UID: 
\"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.159745 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4fb\" (UniqueName: \"kubernetes.io/projected/08cd58a8-ee9e-44a8-874f-2187733e6d57-kube-api-access-dq4fb\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.159785 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.159814 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/08cd58a8-ee9e-44a8-874f-2187733e6d57-metallb-excludel2\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.159834 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-metrics-certs\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: E0311 09:29:46.159932 4830 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 11 09:29:46 crc kubenswrapper[4830]: E0311 09:29:46.159994 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist podName:08cd58a8-ee9e-44a8-874f-2187733e6d57 nodeName:}" 
failed. No retries permitted until 2026-03-11 09:29:46.659975022 +0000 UTC m=+954.441125711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist") pod "speaker-gmrbg" (UID: "08cd58a8-ee9e-44a8-874f-2187733e6d57") : secret "metallb-memberlist" not found Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.160641 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/08cd58a8-ee9e-44a8-874f-2187733e6d57-metallb-excludel2\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.163451 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-cert\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.163482 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-metrics-certs\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.164009 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-metrics-certs\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.168288 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.177635 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4fb\" (UniqueName: \"kubernetes.io/projected/08cd58a8-ee9e-44a8-874f-2187733e6d57-kube-api-access-dq4fb\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.177721 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8rcw\" (UniqueName: \"kubernetes.io/projected/29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be-kube-api-access-f8rcw\") pod \"controller-7bb4cc7c98-qg75l\" (UID: \"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be\") " pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.245134 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.294618 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.441449 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerStarted","Data":"fd0eca06f6874ea116edf2e1bdad07cf0548a0054995aa235c650b1525af8b60"} Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.463109 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-svpm2\" (UID: \"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.487695 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd1c2f0c-c126-4cd9-863d-6ec94f3920ba-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-svpm2\" (UID: \"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.628530 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qg75l"] Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.666662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:46 crc kubenswrapper[4830]: E0311 09:29:46.666837 4830 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 11 09:29:46 crc kubenswrapper[4830]: E0311 09:29:46.666923 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist podName:08cd58a8-ee9e-44a8-874f-2187733e6d57 nodeName:}" failed. No retries permitted until 2026-03-11 09:29:47.666904896 +0000 UTC m=+955.448055585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist") pod "speaker-gmrbg" (UID: "08cd58a8-ee9e-44a8-874f-2187733e6d57") : secret "metallb-memberlist" not found Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.752109 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:46 crc kubenswrapper[4830]: I0311 09:29:46.942712 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2"] Mar 11 09:29:46 crc kubenswrapper[4830]: W0311 09:29:46.943608 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd1c2f0c_c126_4cd9_863d_6ec94f3920ba.slice/crio-18899724b3c984d5820c87dd4ed13ba04f7a1f4922535b3a4e10e90f70c05810 WatchSource:0}: Error finding container 18899724b3c984d5820c87dd4ed13ba04f7a1f4922535b3a4e10e90f70c05810: Status 404 returned error can't find the container with id 18899724b3c984d5820c87dd4ed13ba04f7a1f4922535b3a4e10e90f70c05810 Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.448238 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" event={"ID":"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba","Type":"ContainerStarted","Data":"18899724b3c984d5820c87dd4ed13ba04f7a1f4922535b3a4e10e90f70c05810"} Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.450223 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qg75l" event={"ID":"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be","Type":"ContainerStarted","Data":"6f719aa16a0dd37b68c1f662ac72e90a9aff65ff8c0f34f4b97f7adc80565517"} Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.450254 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qg75l" event={"ID":"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be","Type":"ContainerStarted","Data":"9ed704639976e26f815faf7f5c4a16b9b88c9fd8317a28c082759a1a94758fc6"} Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.450268 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qg75l" 
event={"ID":"29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be","Type":"ContainerStarted","Data":"f318fe8e534a059265316514378f50f5b817b862da0925c67c10a9924eb43e5b"} Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.450359 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.469031 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-qg75l" podStartSLOduration=2.468992905 podStartE2EDuration="2.468992905s" podCreationTimestamp="2026-03-11 09:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:29:47.466670321 +0000 UTC m=+955.247821020" watchObservedRunningTime="2026-03-11 09:29:47.468992905 +0000 UTC m=+955.250143614" Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.680777 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.686417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/08cd58a8-ee9e-44a8-874f-2187733e6d57-memberlist\") pod \"speaker-gmrbg\" (UID: \"08cd58a8-ee9e-44a8-874f-2187733e6d57\") " pod="metallb-system/speaker-gmrbg" Mar 11 09:29:47 crc kubenswrapper[4830]: I0311 09:29:47.722279 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-gmrbg" Mar 11 09:29:48 crc kubenswrapper[4830]: I0311 09:29:48.465074 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gmrbg" event={"ID":"08cd58a8-ee9e-44a8-874f-2187733e6d57","Type":"ContainerStarted","Data":"c9bd4ac9c3b147a5e7fc3223e9b8402d9355fccd7f8e1d5d40ad4b90b095ca76"} Mar 11 09:29:48 crc kubenswrapper[4830]: I0311 09:29:48.465513 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gmrbg" event={"ID":"08cd58a8-ee9e-44a8-874f-2187733e6d57","Type":"ContainerStarted","Data":"bf808ed58f104b79b6d66fb6126fbeae14c89a4fe5ed05eade0810f58ad2f7e4"} Mar 11 09:29:49 crc kubenswrapper[4830]: I0311 09:29:49.472636 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gmrbg" event={"ID":"08cd58a8-ee9e-44a8-874f-2187733e6d57","Type":"ContainerStarted","Data":"5d53b1e88c87c244cd8960a8a3903cf31f77592ffeee0b060748c954f15c1988"} Mar 11 09:29:49 crc kubenswrapper[4830]: I0311 09:29:49.472783 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gmrbg" Mar 11 09:29:49 crc kubenswrapper[4830]: I0311 09:29:49.491291 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gmrbg" podStartSLOduration=4.491271615 podStartE2EDuration="4.491271615s" podCreationTimestamp="2026-03-11 09:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:29:49.490262717 +0000 UTC m=+957.271413406" watchObservedRunningTime="2026-03-11 09:29:49.491271615 +0000 UTC m=+957.272422304" Mar 11 09:29:54 crc kubenswrapper[4830]: I0311 09:29:54.546678 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" 
event={"ID":"bd1c2f0c-c126-4cd9-863d-6ec94f3920ba","Type":"ContainerStarted","Data":"4d580752c3467390ffb757316649ecde44cdc112cecb2a1170eefaafc73836e9"} Mar 11 09:29:54 crc kubenswrapper[4830]: I0311 09:29:54.547058 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:29:54 crc kubenswrapper[4830]: I0311 09:29:54.548709 4830 generic.go:334] "Generic (PLEG): container finished" podID="969c667a-e499-4c6a-9da3-d7813886c794" containerID="b1b78008b8b363dd23b32c377d1b5233ad01081d99cd9e1e5e62ee2b2a3b029a" exitCode=0 Mar 11 09:29:54 crc kubenswrapper[4830]: I0311 09:29:54.548750 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerDied","Data":"b1b78008b8b363dd23b32c377d1b5233ad01081d99cd9e1e5e62ee2b2a3b029a"} Mar 11 09:29:54 crc kubenswrapper[4830]: I0311 09:29:54.568935 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" podStartSLOduration=2.6067043979999998 podStartE2EDuration="9.568908595s" podCreationTimestamp="2026-03-11 09:29:45 +0000 UTC" firstStartedPulling="2026-03-11 09:29:46.946321314 +0000 UTC m=+954.727472023" lastFinishedPulling="2026-03-11 09:29:53.908525531 +0000 UTC m=+961.689676220" observedRunningTime="2026-03-11 09:29:54.564161083 +0000 UTC m=+962.345311802" watchObservedRunningTime="2026-03-11 09:29:54.568908595 +0000 UTC m=+962.350059304" Mar 11 09:29:55 crc kubenswrapper[4830]: I0311 09:29:55.570724 4830 generic.go:334] "Generic (PLEG): container finished" podID="969c667a-e499-4c6a-9da3-d7813886c794" containerID="ebb45539c91ea7053535082fc1e29dcdfdf3c8a9734f7c9b12ae3b5cea22230e" exitCode=0 Mar 11 09:29:55 crc kubenswrapper[4830]: I0311 09:29:55.570842 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" 
event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerDied","Data":"ebb45539c91ea7053535082fc1e29dcdfdf3c8a9734f7c9b12ae3b5cea22230e"} Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.248608 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-qg75l" Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.578374 4830 generic.go:334] "Generic (PLEG): container finished" podID="969c667a-e499-4c6a-9da3-d7813886c794" containerID="1f91d3846f3fc7dbcbfc26956db74a23f7a2012e8c9a9917020a0c8dfe2395d5" exitCode=0 Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.578424 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerDied","Data":"1f91d3846f3fc7dbcbfc26956db74a23f7a2012e8c9a9917020a0c8dfe2395d5"} Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.628647 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xsbxx"] Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.630385 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.653113 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsbxx"] Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.800433 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-catalog-content\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.800486 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh6nh\" (UniqueName: \"kubernetes.io/projected/de6d6702-413c-41dc-bea4-b4c8c6d840a9-kube-api-access-lh6nh\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:56 crc kubenswrapper[4830]: I0311 09:29:56.800534 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-utilities\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:56.902057 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-catalog-content\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:56.902110 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lh6nh\" (UniqueName: \"kubernetes.io/projected/de6d6702-413c-41dc-bea4-b4c8c6d840a9-kube-api-access-lh6nh\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:56.902152 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-utilities\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:56.902674 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-catalog-content\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:56.902833 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-utilities\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:56.930798 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh6nh\" (UniqueName: \"kubernetes.io/projected/de6d6702-413c-41dc-bea4-b4c8c6d840a9-kube-api-access-lh6nh\") pod \"community-operators-xsbxx\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:56.993706 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:57.440396 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsbxx"] Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:57.594830 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsbxx" event={"ID":"de6d6702-413c-41dc-bea4-b4c8c6d840a9","Type":"ContainerStarted","Data":"6c41f67306992e0ea784af0ba1091ab380c534e43f6557090f0d35d64cfa2313"} Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:57.612387 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerStarted","Data":"1646bcb5124e91bf345adf7e1f88f366cad915d3d8d0647a3c80e82037621059"} Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:57.612445 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerStarted","Data":"9daeecf4699c89e1ea1b707b9d4a26e8e5a8735404f8b520cb112de097fb3695"} Mar 11 09:29:57 crc kubenswrapper[4830]: I0311 09:29:57.612457 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerStarted","Data":"320a620c5475410106230176310fbf31933b9fb423f082369292c15cdabbb2b4"} Mar 11 09:29:58 crc kubenswrapper[4830]: I0311 09:29:58.618373 4830 generic.go:334] "Generic (PLEG): container finished" podID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerID="51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2" exitCode=0 Mar 11 09:29:58 crc kubenswrapper[4830]: I0311 09:29:58.618462 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsbxx" 
event={"ID":"de6d6702-413c-41dc-bea4-b4c8c6d840a9","Type":"ContainerDied","Data":"51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2"} Mar 11 09:29:58 crc kubenswrapper[4830]: I0311 09:29:58.623432 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerStarted","Data":"0e1f35963891763f18e91ab04d09d6e072720fabbd4e24211e991d4d96eb9b0d"} Mar 11 09:29:58 crc kubenswrapper[4830]: I0311 09:29:58.623468 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerStarted","Data":"e6da5226ddeec70e5fc1f01273437513e7c1a52ec5afb07b112cf93688895f2b"} Mar 11 09:29:58 crc kubenswrapper[4830]: I0311 09:29:58.623477 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mvhbg" event={"ID":"969c667a-e499-4c6a-9da3-d7813886c794","Type":"ContainerStarted","Data":"373d1f84f532bb0cb8353ea25c0d130166d73f1a7fb89a95c3012bdcf3d25127"} Mar 11 09:29:58 crc kubenswrapper[4830]: I0311 09:29:58.623909 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:29:58 crc kubenswrapper[4830]: I0311 09:29:58.662887 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mvhbg" podStartSLOduration=6.066421375 podStartE2EDuration="13.66287102s" podCreationTimestamp="2026-03-11 09:29:45 +0000 UTC" firstStartedPulling="2026-03-11 09:29:46.294359523 +0000 UTC m=+954.075510212" lastFinishedPulling="2026-03-11 09:29:53.890809168 +0000 UTC m=+961.671959857" observedRunningTime="2026-03-11 09:29:58.65820231 +0000 UTC m=+966.439353019" watchObservedRunningTime="2026-03-11 09:29:58.66287102 +0000 UTC m=+966.444021709" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.135187 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29553690-86dps"] Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.136077 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553690-86dps" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.138154 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.139756 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.139772 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.144489 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553690-86dps"] Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.162506 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpwwk\" (UniqueName: \"kubernetes.io/projected/adb8fb2d-321b-4489-92c2-5a314ae41dbf-kube-api-access-tpwwk\") pod \"auto-csr-approver-29553690-86dps\" (UID: \"adb8fb2d-321b-4489-92c2-5a314ae41dbf\") " pod="openshift-infra/auto-csr-approver-29553690-86dps" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.240712 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg"] Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.241667 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.244548 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.244976 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.250602 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg"] Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.263417 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpwwk\" (UniqueName: \"kubernetes.io/projected/adb8fb2d-321b-4489-92c2-5a314ae41dbf-kube-api-access-tpwwk\") pod \"auto-csr-approver-29553690-86dps\" (UID: \"adb8fb2d-321b-4489-92c2-5a314ae41dbf\") " pod="openshift-infra/auto-csr-approver-29553690-86dps" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.286512 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpwwk\" (UniqueName: \"kubernetes.io/projected/adb8fb2d-321b-4489-92c2-5a314ae41dbf-kube-api-access-tpwwk\") pod \"auto-csr-approver-29553690-86dps\" (UID: \"adb8fb2d-321b-4489-92c2-5a314ae41dbf\") " pod="openshift-infra/auto-csr-approver-29553690-86dps" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.365325 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rqn5\" (UniqueName: \"kubernetes.io/projected/a48921cf-d963-44f0-85ac-224081bf9848-kube-api-access-5rqn5\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc 
kubenswrapper[4830]: I0311 09:30:00.365634 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a48921cf-d963-44f0-85ac-224081bf9848-config-volume\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.365787 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a48921cf-d963-44f0-85ac-224081bf9848-secret-volume\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.451422 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553690-86dps" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.466932 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a48921cf-d963-44f0-85ac-224081bf9848-config-volume\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.467046 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a48921cf-d963-44f0-85ac-224081bf9848-secret-volume\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.467132 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5rqn5\" (UniqueName: \"kubernetes.io/projected/a48921cf-d963-44f0-85ac-224081bf9848-kube-api-access-5rqn5\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.467903 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a48921cf-d963-44f0-85ac-224081bf9848-config-volume\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.472573 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a48921cf-d963-44f0-85ac-224081bf9848-secret-volume\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.486639 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rqn5\" (UniqueName: \"kubernetes.io/projected/a48921cf-d963-44f0-85ac-224081bf9848-kube-api-access-5rqn5\") pod \"collect-profiles-29553690-6cbfg\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.555188 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.649345 4830 generic.go:334] "Generic (PLEG): container finished" podID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerID="b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372" exitCode=0 Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.649631 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsbxx" event={"ID":"de6d6702-413c-41dc-bea4-b4c8c6d840a9","Type":"ContainerDied","Data":"b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372"} Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.780492 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg"] Mar 11 09:30:00 crc kubenswrapper[4830]: W0311 09:30:00.803241 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda48921cf_d963_44f0_85ac_224081bf9848.slice/crio-c5007e814de453b4e773677d6b3448a5fcbae9d3753757b0878cc761880a85a4 WatchSource:0}: Error finding container c5007e814de453b4e773677d6b3448a5fcbae9d3753757b0878cc761880a85a4: Status 404 returned error can't find the container with id c5007e814de453b4e773677d6b3448a5fcbae9d3753757b0878cc761880a85a4 Mar 11 09:30:00 crc kubenswrapper[4830]: I0311 09:30:00.868953 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553690-86dps"] Mar 11 09:30:00 crc kubenswrapper[4830]: W0311 09:30:00.874317 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadb8fb2d_321b_4489_92c2_5a314ae41dbf.slice/crio-ae9d92bddbced159e0d0e9b9c9177e7cb3be8af76b4813c38562af6f5d06e402 WatchSource:0}: Error finding container 
ae9d92bddbced159e0d0e9b9c9177e7cb3be8af76b4813c38562af6f5d06e402: Status 404 returned error can't find the container with id ae9d92bddbced159e0d0e9b9c9177e7cb3be8af76b4813c38562af6f5d06e402 Mar 11 09:30:01 crc kubenswrapper[4830]: I0311 09:30:01.169071 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:30:01 crc kubenswrapper[4830]: I0311 09:30:01.208124 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:30:01 crc kubenswrapper[4830]: I0311 09:30:01.655299 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553690-86dps" event={"ID":"adb8fb2d-321b-4489-92c2-5a314ae41dbf","Type":"ContainerStarted","Data":"ae9d92bddbced159e0d0e9b9c9177e7cb3be8af76b4813c38562af6f5d06e402"} Mar 11 09:30:01 crc kubenswrapper[4830]: I0311 09:30:01.657336 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsbxx" event={"ID":"de6d6702-413c-41dc-bea4-b4c8c6d840a9","Type":"ContainerStarted","Data":"b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22"} Mar 11 09:30:01 crc kubenswrapper[4830]: I0311 09:30:01.658999 4830 generic.go:334] "Generic (PLEG): container finished" podID="a48921cf-d963-44f0-85ac-224081bf9848" containerID="5c1c38b6c7344cc9d9855a8ca2f10df747870f247e36fe59c489d0a1808a6480" exitCode=0 Mar 11 09:30:01 crc kubenswrapper[4830]: I0311 09:30:01.659097 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" event={"ID":"a48921cf-d963-44f0-85ac-224081bf9848","Type":"ContainerDied","Data":"5c1c38b6c7344cc9d9855a8ca2f10df747870f247e36fe59c489d0a1808a6480"} Mar 11 09:30:01 crc kubenswrapper[4830]: I0311 09:30:01.659139 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" 
event={"ID":"a48921cf-d963-44f0-85ac-224081bf9848","Type":"ContainerStarted","Data":"c5007e814de453b4e773677d6b3448a5fcbae9d3753757b0878cc761880a85a4"} Mar 11 09:30:01 crc kubenswrapper[4830]: I0311 09:30:01.676297 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xsbxx" podStartSLOduration=2.963724356 podStartE2EDuration="5.676277303s" podCreationTimestamp="2026-03-11 09:29:56 +0000 UTC" firstStartedPulling="2026-03-11 09:29:58.620306805 +0000 UTC m=+966.401457494" lastFinishedPulling="2026-03-11 09:30:01.332859752 +0000 UTC m=+969.114010441" observedRunningTime="2026-03-11 09:30:01.672188279 +0000 UTC m=+969.453338968" watchObservedRunningTime="2026-03-11 09:30:01.676277303 +0000 UTC m=+969.457427992" Mar 11 09:30:02 crc kubenswrapper[4830]: I0311 09:30:02.919382 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.110461 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rqn5\" (UniqueName: \"kubernetes.io/projected/a48921cf-d963-44f0-85ac-224081bf9848-kube-api-access-5rqn5\") pod \"a48921cf-d963-44f0-85ac-224081bf9848\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.110515 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a48921cf-d963-44f0-85ac-224081bf9848-config-volume\") pod \"a48921cf-d963-44f0-85ac-224081bf9848\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.110653 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a48921cf-d963-44f0-85ac-224081bf9848-secret-volume\") pod 
\"a48921cf-d963-44f0-85ac-224081bf9848\" (UID: \"a48921cf-d963-44f0-85ac-224081bf9848\") " Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.110986 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a48921cf-d963-44f0-85ac-224081bf9848-config-volume" (OuterVolumeSpecName: "config-volume") pod "a48921cf-d963-44f0-85ac-224081bf9848" (UID: "a48921cf-d963-44f0-85ac-224081bf9848"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.115997 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a48921cf-d963-44f0-85ac-224081bf9848-kube-api-access-5rqn5" (OuterVolumeSpecName: "kube-api-access-5rqn5") pod "a48921cf-d963-44f0-85ac-224081bf9848" (UID: "a48921cf-d963-44f0-85ac-224081bf9848"). InnerVolumeSpecName "kube-api-access-5rqn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.116432 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a48921cf-d963-44f0-85ac-224081bf9848-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a48921cf-d963-44f0-85ac-224081bf9848" (UID: "a48921cf-d963-44f0-85ac-224081bf9848"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.212539 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rqn5\" (UniqueName: \"kubernetes.io/projected/a48921cf-d963-44f0-85ac-224081bf9848-kube-api-access-5rqn5\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.212573 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a48921cf-d963-44f0-85ac-224081bf9848-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.212582 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a48921cf-d963-44f0-85ac-224081bf9848-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.673041 4830 generic.go:334] "Generic (PLEG): container finished" podID="adb8fb2d-321b-4489-92c2-5a314ae41dbf" containerID="1082e88df298d5491c45b4e8c99372bf2065e69c892cb2ae2af60456ef07cb32" exitCode=0 Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.673119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553690-86dps" event={"ID":"adb8fb2d-321b-4489-92c2-5a314ae41dbf","Type":"ContainerDied","Data":"1082e88df298d5491c45b4e8c99372bf2065e69c892cb2ae2af60456ef07cb32"} Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.674885 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" event={"ID":"a48921cf-d963-44f0-85ac-224081bf9848","Type":"ContainerDied","Data":"c5007e814de453b4e773677d6b3448a5fcbae9d3753757b0878cc761880a85a4"} Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.674923 4830 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c5007e814de453b4e773677d6b3448a5fcbae9d3753757b0878cc761880a85a4" Mar 11 09:30:03 crc kubenswrapper[4830]: I0311 09:30:03.674939 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg" Mar 11 09:30:04 crc kubenswrapper[4830]: I0311 09:30:04.966629 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553690-86dps" Mar 11 09:30:05 crc kubenswrapper[4830]: I0311 09:30:05.137955 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpwwk\" (UniqueName: \"kubernetes.io/projected/adb8fb2d-321b-4489-92c2-5a314ae41dbf-kube-api-access-tpwwk\") pod \"adb8fb2d-321b-4489-92c2-5a314ae41dbf\" (UID: \"adb8fb2d-321b-4489-92c2-5a314ae41dbf\") " Mar 11 09:30:05 crc kubenswrapper[4830]: I0311 09:30:05.143224 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb8fb2d-321b-4489-92c2-5a314ae41dbf-kube-api-access-tpwwk" (OuterVolumeSpecName: "kube-api-access-tpwwk") pod "adb8fb2d-321b-4489-92c2-5a314ae41dbf" (UID: "adb8fb2d-321b-4489-92c2-5a314ae41dbf"). InnerVolumeSpecName "kube-api-access-tpwwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:30:05 crc kubenswrapper[4830]: I0311 09:30:05.239954 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpwwk\" (UniqueName: \"kubernetes.io/projected/adb8fb2d-321b-4489-92c2-5a314ae41dbf-kube-api-access-tpwwk\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:05 crc kubenswrapper[4830]: I0311 09:30:05.688726 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553690-86dps" event={"ID":"adb8fb2d-321b-4489-92c2-5a314ae41dbf","Type":"ContainerDied","Data":"ae9d92bddbced159e0d0e9b9c9177e7cb3be8af76b4813c38562af6f5d06e402"} Mar 11 09:30:05 crc kubenswrapper[4830]: I0311 09:30:05.688776 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae9d92bddbced159e0d0e9b9c9177e7cb3be8af76b4813c38562af6f5d06e402" Mar 11 09:30:05 crc kubenswrapper[4830]: I0311 09:30:05.688793 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553690-86dps" Mar 11 09:30:05 crc kubenswrapper[4830]: E0311 09:30:05.782489 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadb8fb2d_321b_4489_92c2_5a314ae41dbf.slice\": RecentStats: unable to find data in memory cache]" Mar 11 09:30:06 crc kubenswrapper[4830]: I0311 09:30:06.023429 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553684-clgcq"] Mar 11 09:30:06 crc kubenswrapper[4830]: I0311 09:30:06.027513 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553684-clgcq"] Mar 11 09:30:06 crc kubenswrapper[4830]: I0311 09:30:06.171193 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mvhbg" Mar 11 09:30:06 crc kubenswrapper[4830]: I0311 09:30:06.757709 
4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-svpm2" Mar 11 09:30:06 crc kubenswrapper[4830]: I0311 09:30:06.947078 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a16aaf-f956-485d-ba92-0bc09cd6af26" path="/var/lib/kubelet/pods/23a16aaf-f956-485d-ba92-0bc09cd6af26/volumes" Mar 11 09:30:06 crc kubenswrapper[4830]: I0311 09:30:06.994709 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:30:06 crc kubenswrapper[4830]: I0311 09:30:06.994775 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:30:07 crc kubenswrapper[4830]: I0311 09:30:07.049343 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:30:07 crc kubenswrapper[4830]: I0311 09:30:07.725188 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gmrbg" Mar 11 09:30:07 crc kubenswrapper[4830]: I0311 09:30:07.749605 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:30:07 crc kubenswrapper[4830]: I0311 09:30:07.817755 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsbxx"] Mar 11 09:30:09 crc kubenswrapper[4830]: I0311 09:30:09.709486 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xsbxx" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerName="registry-server" containerID="cri-o://b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22" gracePeriod=2 Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.055650 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.100002 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh6nh\" (UniqueName: \"kubernetes.io/projected/de6d6702-413c-41dc-bea4-b4c8c6d840a9-kube-api-access-lh6nh\") pod \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.100832 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-catalog-content\") pod \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.100897 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-utilities\") pod \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\" (UID: \"de6d6702-413c-41dc-bea4-b4c8c6d840a9\") " Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.101629 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-utilities" (OuterVolumeSpecName: "utilities") pod "de6d6702-413c-41dc-bea4-b4c8c6d840a9" (UID: "de6d6702-413c-41dc-bea4-b4c8c6d840a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.116922 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6d6702-413c-41dc-bea4-b4c8c6d840a9-kube-api-access-lh6nh" (OuterVolumeSpecName: "kube-api-access-lh6nh") pod "de6d6702-413c-41dc-bea4-b4c8c6d840a9" (UID: "de6d6702-413c-41dc-bea4-b4c8c6d840a9"). InnerVolumeSpecName "kube-api-access-lh6nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.159574 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de6d6702-413c-41dc-bea4-b4c8c6d840a9" (UID: "de6d6702-413c-41dc-bea4-b4c8c6d840a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.201660 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.201692 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh6nh\" (UniqueName: \"kubernetes.io/projected/de6d6702-413c-41dc-bea4-b4c8c6d840a9-kube-api-access-lh6nh\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.201703 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de6d6702-413c-41dc-bea4-b4c8c6d840a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515377 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vltbq"] Mar 11 09:30:10 crc kubenswrapper[4830]: E0311 09:30:10.515649 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerName="registry-server" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515667 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerName="registry-server" Mar 11 09:30:10 crc kubenswrapper[4830]: E0311 09:30:10.515684 4830 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="adb8fb2d-321b-4489-92c2-5a314ae41dbf" containerName="oc" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515691 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb8fb2d-321b-4489-92c2-5a314ae41dbf" containerName="oc" Mar 11 09:30:10 crc kubenswrapper[4830]: E0311 09:30:10.515707 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerName="extract-utilities" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515714 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerName="extract-utilities" Mar 11 09:30:10 crc kubenswrapper[4830]: E0311 09:30:10.515727 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerName="extract-content" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515734 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerName="extract-content" Mar 11 09:30:10 crc kubenswrapper[4830]: E0311 09:30:10.515742 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48921cf-d963-44f0-85ac-224081bf9848" containerName="collect-profiles" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515750 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48921cf-d963-44f0-85ac-224081bf9848" containerName="collect-profiles" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515879 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerName="registry-server" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515898 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb8fb2d-321b-4489-92c2-5a314ae41dbf" containerName="oc" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.515910 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a48921cf-d963-44f0-85ac-224081bf9848" containerName="collect-profiles" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.516416 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vltbq" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.519384 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.519519 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.519690 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-kkbrz" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.528306 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vltbq"] Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.709159 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8hqt\" (UniqueName: \"kubernetes.io/projected/c34ca01e-6cd6-4bca-b90b-b533dafc509f-kube-api-access-p8hqt\") pod \"openstack-operator-index-vltbq\" (UID: \"c34ca01e-6cd6-4bca-b90b-b533dafc509f\") " pod="openstack-operators/openstack-operator-index-vltbq" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.717116 4830 generic.go:334] "Generic (PLEG): container finished" podID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" containerID="b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22" exitCode=0 Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.717167 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsbxx" 
event={"ID":"de6d6702-413c-41dc-bea4-b4c8c6d840a9","Type":"ContainerDied","Data":"b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22"} Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.717191 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsbxx" event={"ID":"de6d6702-413c-41dc-bea4-b4c8c6d840a9","Type":"ContainerDied","Data":"6c41f67306992e0ea784af0ba1091ab380c534e43f6557090f0d35d64cfa2313"} Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.717207 4830 scope.go:117] "RemoveContainer" containerID="b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.717335 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xsbxx" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.740254 4830 scope.go:117] "RemoveContainer" containerID="b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.750204 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsbxx"] Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.762547 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xsbxx"] Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.780560 4830 scope.go:117] "RemoveContainer" containerID="51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.803719 4830 scope.go:117] "RemoveContainer" containerID="b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22" Mar 11 09:30:10 crc kubenswrapper[4830]: E0311 09:30:10.804254 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22\": container 
with ID starting with b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22 not found: ID does not exist" containerID="b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.804320 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22"} err="failed to get container status \"b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22\": rpc error: code = NotFound desc = could not find container \"b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22\": container with ID starting with b547cd9575b112b86ebf3db37ccd74a40301904142a33a347bb1f3679931dd22 not found: ID does not exist" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.804412 4830 scope.go:117] "RemoveContainer" containerID="b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372" Mar 11 09:30:10 crc kubenswrapper[4830]: E0311 09:30:10.804745 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372\": container with ID starting with b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372 not found: ID does not exist" containerID="b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.804780 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372"} err="failed to get container status \"b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372\": rpc error: code = NotFound desc = could not find container \"b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372\": container with ID starting with b5e8cf9aad711aa9cbe1fb46c8c5226e69d37cc3f6c382e53c320cef11134372 not 
found: ID does not exist" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.804803 4830 scope.go:117] "RemoveContainer" containerID="51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2" Mar 11 09:30:10 crc kubenswrapper[4830]: E0311 09:30:10.805079 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2\": container with ID starting with 51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2 not found: ID does not exist" containerID="51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.805102 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2"} err="failed to get container status \"51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2\": rpc error: code = NotFound desc = could not find container \"51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2\": container with ID starting with 51a0404efc14429f47036d5ebda33ab07c517c0541d45231453eccca427277b2 not found: ID does not exist" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.810535 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8hqt\" (UniqueName: \"kubernetes.io/projected/c34ca01e-6cd6-4bca-b90b-b533dafc509f-kube-api-access-p8hqt\") pod \"openstack-operator-index-vltbq\" (UID: \"c34ca01e-6cd6-4bca-b90b-b533dafc509f\") " pod="openstack-operators/openstack-operator-index-vltbq" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.830859 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8hqt\" (UniqueName: \"kubernetes.io/projected/c34ca01e-6cd6-4bca-b90b-b533dafc509f-kube-api-access-p8hqt\") pod \"openstack-operator-index-vltbq\" (UID: 
\"c34ca01e-6cd6-4bca-b90b-b533dafc509f\") " pod="openstack-operators/openstack-operator-index-vltbq" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.833598 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vltbq" Mar 11 09:30:10 crc kubenswrapper[4830]: I0311 09:30:10.942253 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6d6702-413c-41dc-bea4-b4c8c6d840a9" path="/var/lib/kubelet/pods/de6d6702-413c-41dc-bea4-b4c8c6d840a9/volumes" Mar 11 09:30:11 crc kubenswrapper[4830]: I0311 09:30:11.291569 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vltbq"] Mar 11 09:30:11 crc kubenswrapper[4830]: I0311 09:30:11.723672 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vltbq" event={"ID":"c34ca01e-6cd6-4bca-b90b-b533dafc509f","Type":"ContainerStarted","Data":"87a90a1e203e85a250822507900fe05ebaf2646ecec5309fe36815a2a2f8ee06"} Mar 11 09:30:13 crc kubenswrapper[4830]: I0311 09:30:13.060320 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:30:13 crc kubenswrapper[4830]: I0311 09:30:13.060641 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:30:15 crc kubenswrapper[4830]: I0311 09:30:15.879603 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vltbq"] Mar 11 09:30:16 crc 
kubenswrapper[4830]: I0311 09:30:16.694918 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6wtvn"] Mar 11 09:30:16 crc kubenswrapper[4830]: I0311 09:30:16.696204 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:16 crc kubenswrapper[4830]: I0311 09:30:16.700603 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6wtvn"] Mar 11 09:30:16 crc kubenswrapper[4830]: I0311 09:30:16.891870 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjlc\" (UniqueName: \"kubernetes.io/projected/fdd7a557-69d6-4baf-89e5-a8bf6219aaa0-kube-api-access-9sjlc\") pod \"openstack-operator-index-6wtvn\" (UID: \"fdd7a557-69d6-4baf-89e5-a8bf6219aaa0\") " pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:16 crc kubenswrapper[4830]: I0311 09:30:16.993799 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjlc\" (UniqueName: \"kubernetes.io/projected/fdd7a557-69d6-4baf-89e5-a8bf6219aaa0-kube-api-access-9sjlc\") pod \"openstack-operator-index-6wtvn\" (UID: \"fdd7a557-69d6-4baf-89e5-a8bf6219aaa0\") " pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:17 crc kubenswrapper[4830]: I0311 09:30:17.013855 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjlc\" (UniqueName: \"kubernetes.io/projected/fdd7a557-69d6-4baf-89e5-a8bf6219aaa0-kube-api-access-9sjlc\") pod \"openstack-operator-index-6wtvn\" (UID: \"fdd7a557-69d6-4baf-89e5-a8bf6219aaa0\") " pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:17 crc kubenswrapper[4830]: I0311 09:30:17.021423 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:17 crc kubenswrapper[4830]: I0311 09:30:17.449317 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6wtvn"] Mar 11 09:30:17 crc kubenswrapper[4830]: W0311 09:30:17.457532 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd7a557_69d6_4baf_89e5_a8bf6219aaa0.slice/crio-906dfece91f63afb29094cf39a42f57b2ab9411ce1d6573e1296097d5eda62f4 WatchSource:0}: Error finding container 906dfece91f63afb29094cf39a42f57b2ab9411ce1d6573e1296097d5eda62f4: Status 404 returned error can't find the container with id 906dfece91f63afb29094cf39a42f57b2ab9411ce1d6573e1296097d5eda62f4 Mar 11 09:30:17 crc kubenswrapper[4830]: I0311 09:30:17.765867 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vltbq" event={"ID":"c34ca01e-6cd6-4bca-b90b-b533dafc509f","Type":"ContainerStarted","Data":"ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c"} Mar 11 09:30:17 crc kubenswrapper[4830]: I0311 09:30:17.765923 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-vltbq" podUID="c34ca01e-6cd6-4bca-b90b-b533dafc509f" containerName="registry-server" containerID="cri-o://ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c" gracePeriod=2 Mar 11 09:30:17 crc kubenswrapper[4830]: I0311 09:30:17.768886 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6wtvn" event={"ID":"fdd7a557-69d6-4baf-89e5-a8bf6219aaa0","Type":"ContainerStarted","Data":"906dfece91f63afb29094cf39a42f57b2ab9411ce1d6573e1296097d5eda62f4"} Mar 11 09:30:17 crc kubenswrapper[4830]: I0311 09:30:17.788323 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-vltbq" podStartSLOduration=1.878982861 podStartE2EDuration="7.788295101s" podCreationTimestamp="2026-03-11 09:30:10 +0000 UTC" firstStartedPulling="2026-03-11 09:30:11.293297867 +0000 UTC m=+979.074448556" lastFinishedPulling="2026-03-11 09:30:17.202610097 +0000 UTC m=+984.983760796" observedRunningTime="2026-03-11 09:30:17.781957475 +0000 UTC m=+985.563108214" watchObservedRunningTime="2026-03-11 09:30:17.788295101 +0000 UTC m=+985.569445790" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.180869 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vltbq" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.313598 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8hqt\" (UniqueName: \"kubernetes.io/projected/c34ca01e-6cd6-4bca-b90b-b533dafc509f-kube-api-access-p8hqt\") pod \"c34ca01e-6cd6-4bca-b90b-b533dafc509f\" (UID: \"c34ca01e-6cd6-4bca-b90b-b533dafc509f\") " Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.319244 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34ca01e-6cd6-4bca-b90b-b533dafc509f-kube-api-access-p8hqt" (OuterVolumeSpecName: "kube-api-access-p8hqt") pod "c34ca01e-6cd6-4bca-b90b-b533dafc509f" (UID: "c34ca01e-6cd6-4bca-b90b-b533dafc509f"). InnerVolumeSpecName "kube-api-access-p8hqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.414758 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8hqt\" (UniqueName: \"kubernetes.io/projected/c34ca01e-6cd6-4bca-b90b-b533dafc509f-kube-api-access-p8hqt\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.777855 4830 generic.go:334] "Generic (PLEG): container finished" podID="c34ca01e-6cd6-4bca-b90b-b533dafc509f" containerID="ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c" exitCode=0 Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.777968 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vltbq" event={"ID":"c34ca01e-6cd6-4bca-b90b-b533dafc509f","Type":"ContainerDied","Data":"ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c"} Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.778011 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vltbq" event={"ID":"c34ca01e-6cd6-4bca-b90b-b533dafc509f","Type":"ContainerDied","Data":"87a90a1e203e85a250822507900fe05ebaf2646ecec5309fe36815a2a2f8ee06"} Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.778079 4830 scope.go:117] "RemoveContainer" containerID="ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.778243 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-vltbq" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.790564 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6wtvn" event={"ID":"fdd7a557-69d6-4baf-89e5-a8bf6219aaa0","Type":"ContainerStarted","Data":"98f1124cff435b521a71e9e3271f8562441b2ee4e6a8696eb803efd9d632dbca"} Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.813367 4830 scope.go:117] "RemoveContainer" containerID="ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c" Mar 11 09:30:18 crc kubenswrapper[4830]: E0311 09:30:18.819446 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c\": container with ID starting with ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c not found: ID does not exist" containerID="ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.819498 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c"} err="failed to get container status \"ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c\": rpc error: code = NotFound desc = could not find container \"ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c\": container with ID starting with ac25753f072c7975d0a170a159d213faef979ffd9af144984ac86069b480599c not found: ID does not exist" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.835615 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6wtvn" podStartSLOduration=2.64778217 podStartE2EDuration="2.835596485s" podCreationTimestamp="2026-03-11 09:30:16 +0000 UTC" firstStartedPulling="2026-03-11 
09:30:17.461301731 +0000 UTC m=+985.242452440" lastFinishedPulling="2026-03-11 09:30:17.649116066 +0000 UTC m=+985.430266755" observedRunningTime="2026-03-11 09:30:18.834491954 +0000 UTC m=+986.615642653" watchObservedRunningTime="2026-03-11 09:30:18.835596485 +0000 UTC m=+986.616747174" Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.860294 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-vltbq"] Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.875301 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-vltbq"] Mar 11 09:30:18 crc kubenswrapper[4830]: I0311 09:30:18.941060 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34ca01e-6cd6-4bca-b90b-b533dafc509f" path="/var/lib/kubelet/pods/c34ca01e-6cd6-4bca-b90b-b533dafc509f/volumes" Mar 11 09:30:27 crc kubenswrapper[4830]: I0311 09:30:27.021874 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:27 crc kubenswrapper[4830]: I0311 09:30:27.022212 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:27 crc kubenswrapper[4830]: I0311 09:30:27.050340 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:27 crc kubenswrapper[4830]: I0311 09:30:27.882573 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-6wtvn" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.320301 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf"] Mar 11 09:30:34 crc kubenswrapper[4830]: E0311 09:30:34.321108 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c34ca01e-6cd6-4bca-b90b-b533dafc509f" containerName="registry-server" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.321125 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34ca01e-6cd6-4bca-b90b-b533dafc509f" containerName="registry-server" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.321272 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34ca01e-6cd6-4bca-b90b-b533dafc509f" containerName="registry-server" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.322230 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.324911 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-47ttf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.333241 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf"] Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.434220 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc62q\" (UniqueName: \"kubernetes.io/projected/5e7bf2dc-7510-484d-8531-fe0bf51767c3-kube-api-access-fc62q\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.434281 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-bundle\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " 
pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.434340 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-util\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.536058 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-bundle\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.536181 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-util\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.536277 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc62q\" (UniqueName: \"kubernetes.io/projected/5e7bf2dc-7510-484d-8531-fe0bf51767c3-kube-api-access-fc62q\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 
09:30:34.536499 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-bundle\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.536559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-util\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.557976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc62q\" (UniqueName: \"kubernetes.io/projected/5e7bf2dc-7510-484d-8531-fe0bf51767c3-kube-api-access-fc62q\") pod \"0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:34 crc kubenswrapper[4830]: I0311 09:30:34.640647 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:35 crc kubenswrapper[4830]: I0311 09:30:35.033419 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf"] Mar 11 09:30:35 crc kubenswrapper[4830]: I0311 09:30:35.902107 4830 generic.go:334] "Generic (PLEG): container finished" podID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerID="4aa336f82819bdfc318ee4ef41588a7bc0727d8ca5e53a65ebfcda4676c3b237" exitCode=0 Mar 11 09:30:35 crc kubenswrapper[4830]: I0311 09:30:35.902187 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" event={"ID":"5e7bf2dc-7510-484d-8531-fe0bf51767c3","Type":"ContainerDied","Data":"4aa336f82819bdfc318ee4ef41588a7bc0727d8ca5e53a65ebfcda4676c3b237"} Mar 11 09:30:35 crc kubenswrapper[4830]: I0311 09:30:35.903587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" event={"ID":"5e7bf2dc-7510-484d-8531-fe0bf51767c3","Type":"ContainerStarted","Data":"88a3a08b30523f3980c29c14844fde71f983d54fff5ea3299e31c52c27c93b33"} Mar 11 09:30:37 crc kubenswrapper[4830]: I0311 09:30:37.920906 4830 generic.go:334] "Generic (PLEG): container finished" podID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerID="2182ef666adfe5671ea6ace2d21b6009147276bf10876ff5a47a3867c431b53a" exitCode=0 Mar 11 09:30:37 crc kubenswrapper[4830]: I0311 09:30:37.922065 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" event={"ID":"5e7bf2dc-7510-484d-8531-fe0bf51767c3","Type":"ContainerDied","Data":"2182ef666adfe5671ea6ace2d21b6009147276bf10876ff5a47a3867c431b53a"} Mar 11 09:30:38 crc kubenswrapper[4830]: I0311 09:30:38.930339 4830 generic.go:334] 
"Generic (PLEG): container finished" podID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerID="17044f5094aca9ad8a7d7bd0378c882c8f3a0d6d68716f60a8e796f1e3fdd71a" exitCode=0 Mar 11 09:30:38 crc kubenswrapper[4830]: I0311 09:30:38.930453 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" event={"ID":"5e7bf2dc-7510-484d-8531-fe0bf51767c3","Type":"ContainerDied","Data":"17044f5094aca9ad8a7d7bd0378c882c8f3a0d6d68716f60a8e796f1e3fdd71a"} Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.223783 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.317197 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc62q\" (UniqueName: \"kubernetes.io/projected/5e7bf2dc-7510-484d-8531-fe0bf51767c3-kube-api-access-fc62q\") pod \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.317369 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-bundle\") pod \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.317394 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-util\") pod \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\" (UID: \"5e7bf2dc-7510-484d-8531-fe0bf51767c3\") " Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.318082 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-bundle" (OuterVolumeSpecName: "bundle") pod "5e7bf2dc-7510-484d-8531-fe0bf51767c3" (UID: "5e7bf2dc-7510-484d-8531-fe0bf51767c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.322535 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e7bf2dc-7510-484d-8531-fe0bf51767c3-kube-api-access-fc62q" (OuterVolumeSpecName: "kube-api-access-fc62q") pod "5e7bf2dc-7510-484d-8531-fe0bf51767c3" (UID: "5e7bf2dc-7510-484d-8531-fe0bf51767c3"). InnerVolumeSpecName "kube-api-access-fc62q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.330355 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-util" (OuterVolumeSpecName: "util") pod "5e7bf2dc-7510-484d-8531-fe0bf51767c3" (UID: "5e7bf2dc-7510-484d-8531-fe0bf51767c3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.418869 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc62q\" (UniqueName: \"kubernetes.io/projected/5e7bf2dc-7510-484d-8531-fe0bf51767c3-kube-api-access-fc62q\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.418906 4830 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.418923 4830 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5e7bf2dc-7510-484d-8531-fe0bf51767c3-util\") on node \"crc\" DevicePath \"\"" Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.948968 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" event={"ID":"5e7bf2dc-7510-484d-8531-fe0bf51767c3","Type":"ContainerDied","Data":"88a3a08b30523f3980c29c14844fde71f983d54fff5ea3299e31c52c27c93b33"} Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.948997 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf" Mar 11 09:30:40 crc kubenswrapper[4830]: I0311 09:30:40.949042 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88a3a08b30523f3980c29c14844fde71f983d54fff5ea3299e31c52c27c93b33" Mar 11 09:30:43 crc kubenswrapper[4830]: I0311 09:30:43.060687 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:30:43 crc kubenswrapper[4830]: I0311 09:30:43.060772 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.542775 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh"] Mar 11 09:30:46 crc kubenswrapper[4830]: E0311 09:30:46.543369 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerName="extract" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.543387 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerName="extract" Mar 11 09:30:46 crc kubenswrapper[4830]: E0311 09:30:46.543401 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerName="pull" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.543408 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerName="pull" Mar 11 09:30:46 crc kubenswrapper[4830]: E0311 09:30:46.543437 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerName="util" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.543445 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerName="util" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.543577 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e7bf2dc-7510-484d-8531-fe0bf51767c3" containerName="extract" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.544142 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.545871 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-vf7c9" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.600591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngx4j\" (UniqueName: \"kubernetes.io/projected/843e8d8e-4cb5-4260-af55-147e416c0791-kube-api-access-ngx4j\") pod \"openstack-operator-controller-init-67d889964b-xg4rh\" (UID: \"843e8d8e-4cb5-4260-af55-147e416c0791\") " pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.619009 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh"] Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.701960 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngx4j\" (UniqueName: 
\"kubernetes.io/projected/843e8d8e-4cb5-4260-af55-147e416c0791-kube-api-access-ngx4j\") pod \"openstack-operator-controller-init-67d889964b-xg4rh\" (UID: \"843e8d8e-4cb5-4260-af55-147e416c0791\") " pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.752180 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngx4j\" (UniqueName: \"kubernetes.io/projected/843e8d8e-4cb5-4260-af55-147e416c0791-kube-api-access-ngx4j\") pod \"openstack-operator-controller-init-67d889964b-xg4rh\" (UID: \"843e8d8e-4cb5-4260-af55-147e416c0791\") " pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" Mar 11 09:30:46 crc kubenswrapper[4830]: I0311 09:30:46.860302 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" Mar 11 09:30:47 crc kubenswrapper[4830]: I0311 09:30:47.087486 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh"] Mar 11 09:30:47 crc kubenswrapper[4830]: I0311 09:30:47.998823 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" event={"ID":"843e8d8e-4cb5-4260-af55-147e416c0791","Type":"ContainerStarted","Data":"4c5cc5c1455a305afdc913a8a4e455ab396059602b8af72045cf16325fdefd33"} Mar 11 09:30:52 crc kubenswrapper[4830]: I0311 09:30:52.028999 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" event={"ID":"843e8d8e-4cb5-4260-af55-147e416c0791","Type":"ContainerStarted","Data":"ce96e663610965444d4abbf3604f7310d689333d3f5f30638c651b104b121f6f"} Mar 11 09:30:52 crc kubenswrapper[4830]: I0311 09:30:52.029532 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" Mar 11 09:30:52 crc kubenswrapper[4830]: I0311 09:30:52.060096 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" podStartSLOduration=1.913292974 podStartE2EDuration="6.060075399s" podCreationTimestamp="2026-03-11 09:30:46 +0000 UTC" firstStartedPulling="2026-03-11 09:30:47.10470434 +0000 UTC m=+1014.885855029" lastFinishedPulling="2026-03-11 09:30:51.251486765 +0000 UTC m=+1019.032637454" observedRunningTime="2026-03-11 09:30:52.053401613 +0000 UTC m=+1019.834552312" watchObservedRunningTime="2026-03-11 09:30:52.060075399 +0000 UTC m=+1019.841226098" Mar 11 09:30:56 crc kubenswrapper[4830]: I0311 09:30:56.246634 4830 scope.go:117] "RemoveContainer" containerID="07b485ff0fb5b27baeb096561a4f48acad4bb2ead35af12ab7fca4419525a027" Mar 11 09:30:56 crc kubenswrapper[4830]: I0311 09:30:56.865635 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-67d889964b-xg4rh" Mar 11 09:31:13 crc kubenswrapper[4830]: I0311 09:31:13.060357 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:31:13 crc kubenswrapper[4830]: I0311 09:31:13.060892 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:31:13 crc kubenswrapper[4830]: I0311 09:31:13.060994 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:31:13 crc kubenswrapper[4830]: I0311 09:31:13.061704 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"490e43e253d22e49dfc5c2a704ffdefb34fe709ef23f6e9173eecf22518d399e"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 09:31:13 crc kubenswrapper[4830]: I0311 09:31:13.061791 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://490e43e253d22e49dfc5c2a704ffdefb34fe709ef23f6e9173eecf22518d399e" gracePeriod=600 Mar 11 09:31:14 crc kubenswrapper[4830]: I0311 09:31:14.181060 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="490e43e253d22e49dfc5c2a704ffdefb34fe709ef23f6e9173eecf22518d399e" exitCode=0 Mar 11 09:31:14 crc kubenswrapper[4830]: I0311 09:31:14.181156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"490e43e253d22e49dfc5c2a704ffdefb34fe709ef23f6e9173eecf22518d399e"} Mar 11 09:31:14 crc kubenswrapper[4830]: I0311 09:31:14.181675 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"e06cb787da7fb0f6798e1465b6764e43fa4f19a8709a93ec236e4a0b85a72f7c"} Mar 11 09:31:14 crc kubenswrapper[4830]: I0311 09:31:14.181703 4830 scope.go:117] "RemoveContainer" 
containerID="9f1549afce8227de9820039f9dd4bcf657fcc7950e158e1064942fb283e47f6d" Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.967853 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww"] Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.969339 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.971142 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vw87h" Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.972223 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm"] Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.973007 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.974441 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bfxz8" Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.980470 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww"] Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.985883 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm"] Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.991470 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc"] Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.992367 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" Mar 11 09:31:32 crc kubenswrapper[4830]: I0311 09:31:32.995463 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-nbs4v" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.007914 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.009205 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.011939 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-m9xcb" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.030179 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.032919 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmfmn\" (UniqueName: \"kubernetes.io/projected/16121653-f66c-441b-b1e2-8cd3c1e558e4-kube-api-access-gmfmn\") pod \"designate-operator-controller-manager-66d56f6ff4-n5khc\" (UID: \"16121653-f66c-441b-b1e2-8cd3c1e558e4\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.032964 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2xsg\" (UniqueName: \"kubernetes.io/projected/ceffcca8-5182-4f52-b359-e20664c1d527-kube-api-access-d2xsg\") pod \"cinder-operator-controller-manager-984cd4dcf-rjsvm\" (UID: \"ceffcca8-5182-4f52-b359-e20664c1d527\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.033005 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgwm9\" (UniqueName: \"kubernetes.io/projected/d7442149-a02a-401b-b3bd-c1d470af5b3b-kube-api-access-rgwm9\") pod \"glance-operator-controller-manager-5964f64c48-7kzdv\" (UID: \"d7442149-a02a-401b-b3bd-c1d470af5b3b\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 
09:31:33.061582 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.069243 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-xg456"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.069923 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.077140 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jq5qg" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.089994 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-xg456"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.116084 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.117497 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.120075 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.120830 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.122051 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-qndkw" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.125921 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-n4mvt" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.135110 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcz5q\" (UniqueName: \"kubernetes.io/projected/fe16642f-b4c0-45e6-b222-83fcc2c3fb5c-kube-api-access-hcz5q\") pod \"ironic-operator-controller-manager-6bbb499bbc-mtd4g\" (UID: \"fe16642f-b4c0-45e6-b222-83fcc2c3fb5c\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.135186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7wqz\" (UniqueName: \"kubernetes.io/projected/283b8bf7-046d-4600-be30-f578a6ec3c4d-kube-api-access-m7wqz\") pod \"horizon-operator-controller-manager-6d9d6b584d-7cg7m\" (UID: \"283b8bf7-046d-4600-be30-f578a6ec3c4d\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.135259 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzsk8\" (UniqueName: \"kubernetes.io/projected/3112f394-9b8e-43c2-9707-94ac1a2778db-kube-api-access-bzsk8\") pod \"heat-operator-controller-manager-77b6666d85-xg456\" (UID: \"3112f394-9b8e-43c2-9707-94ac1a2778db\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 
09:31:33.135310 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvlk\" (UniqueName: \"kubernetes.io/projected/d72c70bc-5f58-4c0f-a584-f352adf175e7-kube-api-access-kfvlk\") pod \"barbican-operator-controller-manager-677bd678f7-cxpww\" (UID: \"d72c70bc-5f58-4c0f-a584-f352adf175e7\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.135303 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.135383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmfmn\" (UniqueName: \"kubernetes.io/projected/16121653-f66c-441b-b1e2-8cd3c1e558e4-kube-api-access-gmfmn\") pod \"designate-operator-controller-manager-66d56f6ff4-n5khc\" (UID: \"16121653-f66c-441b-b1e2-8cd3c1e558e4\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.135438 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2xsg\" (UniqueName: \"kubernetes.io/projected/ceffcca8-5182-4f52-b359-e20664c1d527-kube-api-access-d2xsg\") pod \"cinder-operator-controller-manager-984cd4dcf-rjsvm\" (UID: \"ceffcca8-5182-4f52-b359-e20664c1d527\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.135526 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgwm9\" (UniqueName: \"kubernetes.io/projected/d7442149-a02a-401b-b3bd-c1d470af5b3b-kube-api-access-rgwm9\") pod \"glance-operator-controller-manager-5964f64c48-7kzdv\" (UID: \"d7442149-a02a-401b-b3bd-c1d470af5b3b\") " 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.145206 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.158306 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.166465 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-gmxqr" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.167237 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.190105 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2xsg\" (UniqueName: \"kubernetes.io/projected/ceffcca8-5182-4f52-b359-e20664c1d527-kube-api-access-d2xsg\") pod \"cinder-operator-controller-manager-984cd4dcf-rjsvm\" (UID: \"ceffcca8-5182-4f52-b359-e20664c1d527\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.198711 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgwm9\" (UniqueName: \"kubernetes.io/projected/d7442149-a02a-401b-b3bd-c1d470af5b3b-kube-api-access-rgwm9\") pod \"glance-operator-controller-manager-5964f64c48-7kzdv\" (UID: \"d7442149-a02a-401b-b3bd-c1d470af5b3b\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.200872 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmfmn\" (UniqueName: 
\"kubernetes.io/projected/16121653-f66c-441b-b1e2-8cd3c1e558e4-kube-api-access-gmfmn\") pod \"designate-operator-controller-manager-66d56f6ff4-n5khc\" (UID: \"16121653-f66c-441b-b1e2-8cd3c1e558e4\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.201379 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.221805 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.231581 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.235412 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xfbf9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.236847 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7wqz\" (UniqueName: \"kubernetes.io/projected/283b8bf7-046d-4600-be30-f578a6ec3c4d-kube-api-access-m7wqz\") pod \"horizon-operator-controller-manager-6d9d6b584d-7cg7m\" (UID: \"283b8bf7-046d-4600-be30-f578a6ec3c4d\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.236950 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzsk8\" (UniqueName: \"kubernetes.io/projected/3112f394-9b8e-43c2-9707-94ac1a2778db-kube-api-access-bzsk8\") pod \"heat-operator-controller-manager-77b6666d85-xg456\" (UID: \"3112f394-9b8e-43c2-9707-94ac1a2778db\") " 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.238482 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.238824 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rcqm\" (UniqueName: \"kubernetes.io/projected/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-kube-api-access-7rcqm\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.238867 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvlk\" (UniqueName: \"kubernetes.io/projected/d72c70bc-5f58-4c0f-a584-f352adf175e7-kube-api-access-kfvlk\") pod \"barbican-operator-controller-manager-677bd678f7-cxpww\" (UID: \"d72c70bc-5f58-4c0f-a584-f352adf175e7\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.238913 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.238957 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbp6b\" (UniqueName: \"kubernetes.io/projected/1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f-kube-api-access-sbp6b\") pod 
\"keystone-operator-controller-manager-684f77d66d-bcbp5\" (UID: \"1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.239129 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcz5q\" (UniqueName: \"kubernetes.io/projected/fe16642f-b4c0-45e6-b222-83fcc2c3fb5c-kube-api-access-hcz5q\") pod \"ironic-operator-controller-manager-6bbb499bbc-mtd4g\" (UID: \"fe16642f-b4c0-45e6-b222-83fcc2c3fb5c\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.258762 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.266125 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzsk8\" (UniqueName: \"kubernetes.io/projected/3112f394-9b8e-43c2-9707-94ac1a2778db-kube-api-access-bzsk8\") pod \"heat-operator-controller-manager-77b6666d85-xg456\" (UID: \"3112f394-9b8e-43c2-9707-94ac1a2778db\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.266668 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.269516 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.273398 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-ntzjg" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.277523 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7wqz\" (UniqueName: \"kubernetes.io/projected/283b8bf7-046d-4600-be30-f578a6ec3c4d-kube-api-access-m7wqz\") pod \"horizon-operator-controller-manager-6d9d6b584d-7cg7m\" (UID: \"283b8bf7-046d-4600-be30-f578a6ec3c4d\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.279791 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcz5q\" (UniqueName: \"kubernetes.io/projected/fe16642f-b4c0-45e6-b222-83fcc2c3fb5c-kube-api-access-hcz5q\") pod \"ironic-operator-controller-manager-6bbb499bbc-mtd4g\" (UID: \"fe16642f-b4c0-45e6-b222-83fcc2c3fb5c\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.289003 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.289316 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvlk\" (UniqueName: \"kubernetes.io/projected/d72c70bc-5f58-4c0f-a584-f352adf175e7-kube-api-access-kfvlk\") pod \"barbican-operator-controller-manager-677bd678f7-cxpww\" (UID: \"d72c70bc-5f58-4c0f-a584-f352adf175e7\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.291124 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.297309 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.298268 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.305729 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-pq9dk" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.310168 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.323456 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.324522 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.325991 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-8t44f" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.332416 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.340767 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t59gq\" (UniqueName: \"kubernetes.io/projected/2a85b060-3965-4d51-b568-2b360fee4c44-kube-api-access-t59gq\") pod \"manila-operator-controller-manager-68f45f9d9f-9pvn7\" (UID: \"2a85b060-3965-4d51-b568-2b360fee4c44\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.340830 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfltf\" (UniqueName: \"kubernetes.io/projected/6ee90085-fc25-4491-a2fc-9b45d5d8207a-kube-api-access-qfltf\") pod \"neutron-operator-controller-manager-776c5696bf-5gkz6\" (UID: \"6ee90085-fc25-4491-a2fc-9b45d5d8207a\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.340865 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2dxm\" (UniqueName: \"kubernetes.io/projected/c29c2a15-0eb3-41aa-b0b9-710a5ed56a87-kube-api-access-m2dxm\") pod \"mariadb-operator-controller-manager-658d4cdd5-q98j9\" (UID: \"c29c2a15-0eb3-41aa-b0b9-710a5ed56a87\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.340930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rcqm\" (UniqueName: \"kubernetes.io/projected/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-kube-api-access-7rcqm\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" 
Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.340969 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.341008 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbp6b\" (UniqueName: \"kubernetes.io/projected/1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f-kube-api-access-sbp6b\") pod \"keystone-operator-controller-manager-684f77d66d-bcbp5\" (UID: \"1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.341292 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.341409 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert podName:74ba9d62-2d47-46a5-bd26-1a81bb0a8484 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:33.841389684 +0000 UTC m=+1061.622540373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert") pod "infra-operator-controller-manager-5995f4446f-ntcnm" (UID: "74ba9d62-2d47-46a5-bd26-1a81bb0a8484") : secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.347508 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.360809 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.360857 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rcqm\" (UniqueName: \"kubernetes.io/projected/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-kube-api-access-7rcqm\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.360946 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbp6b\" (UniqueName: \"kubernetes.io/projected/1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f-kube-api-access-sbp6b\") pod \"keystone-operator-controller-manager-684f77d66d-bcbp5\" (UID: \"1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.362180 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.365456 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qwzls" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.376317 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.389115 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.393111 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.393955 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.396849 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-dcrwm" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.399710 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.402749 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.413046 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.417001 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.417828 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.420177 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.420314 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-czqvd" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.422134 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.423039 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.424284 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5h5sc" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.429342 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.430403 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.436132 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nc2qd" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.439853 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.442754 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdt9\" (UniqueName: \"kubernetes.io/projected/5602b15d-928b-4138-a7f0-66f8e8d037b8-kube-api-access-szdt9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-kwh58\" (UID: \"5602b15d-928b-4138-a7f0-66f8e8d037b8\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.442812 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqrgn\" (UniqueName: \"kubernetes.io/projected/94d241ed-64bc-4152-b445-51ae5a61bb95-kube-api-access-kqrgn\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.442848 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.442896 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t59gq\" (UniqueName: \"kubernetes.io/projected/2a85b060-3965-4d51-b568-2b360fee4c44-kube-api-access-t59gq\") pod \"manila-operator-controller-manager-68f45f9d9f-9pvn7\" (UID: \"2a85b060-3965-4d51-b568-2b360fee4c44\") " 
pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.444727 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsdwg\" (UniqueName: \"kubernetes.io/projected/dedc5b41-d549-4015-b010-bc07cea3d318-kube-api-access-wsdwg\") pod \"placement-operator-controller-manager-574d45c66c-57m9k\" (UID: \"dedc5b41-d549-4015-b010-bc07cea3d318\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.444786 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfltf\" (UniqueName: \"kubernetes.io/projected/6ee90085-fc25-4491-a2fc-9b45d5d8207a-kube-api-access-qfltf\") pod \"neutron-operator-controller-manager-776c5696bf-5gkz6\" (UID: \"6ee90085-fc25-4491-a2fc-9b45d5d8207a\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.444813 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2dxm\" (UniqueName: \"kubernetes.io/projected/c29c2a15-0eb3-41aa-b0b9-710a5ed56a87-kube-api-access-m2dxm\") pod \"mariadb-operator-controller-manager-658d4cdd5-q98j9\" (UID: \"c29c2a15-0eb3-41aa-b0b9-710a5ed56a87\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.444868 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhl5n\" (UniqueName: \"kubernetes.io/projected/2525ca9b-eb81-4fce-86b4-a767db795de6-kube-api-access-dhl5n\") pod \"nova-operator-controller-manager-569cc54c5-c5vgk\" (UID: \"2525ca9b-eb81-4fce-86b4-a767db795de6\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" Mar 11 09:31:33 crc kubenswrapper[4830]: 
I0311 09:31:33.444943 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkn2s\" (UniqueName: \"kubernetes.io/projected/f6f6b27c-94d8-456d-8d41-19c905065e1d-kube-api-access-fkn2s\") pod \"ovn-operator-controller-manager-bbc5b68f9-xmctv\" (UID: \"f6f6b27c-94d8-456d-8d41-19c905065e1d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.457466 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.458862 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.468035 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2dxm\" (UniqueName: \"kubernetes.io/projected/c29c2a15-0eb3-41aa-b0b9-710a5ed56a87-kube-api-access-m2dxm\") pod \"mariadb-operator-controller-manager-658d4cdd5-q98j9\" (UID: \"c29c2a15-0eb3-41aa-b0b9-710a5ed56a87\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.471100 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.475176 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t59gq\" (UniqueName: \"kubernetes.io/projected/2a85b060-3965-4d51-b568-2b360fee4c44-kube-api-access-t59gq\") pod \"manila-operator-controller-manager-68f45f9d9f-9pvn7\" (UID: \"2a85b060-3965-4d51-b568-2b360fee4c44\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 
09:31:33.486371 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.491208 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfltf\" (UniqueName: \"kubernetes.io/projected/6ee90085-fc25-4491-a2fc-9b45d5d8207a-kube-api-access-qfltf\") pod \"neutron-operator-controller-manager-776c5696bf-5gkz6\" (UID: \"6ee90085-fc25-4491-a2fc-9b45d5d8207a\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.498282 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.499109 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.500974 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zpq5b" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.516626 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.547221 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqrgn\" (UniqueName: \"kubernetes.io/projected/94d241ed-64bc-4152-b445-51ae5a61bb95-kube-api-access-kqrgn\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.547262 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.547312 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsdwg\" (UniqueName: \"kubernetes.io/projected/dedc5b41-d549-4015-b010-bc07cea3d318-kube-api-access-wsdwg\") pod \"placement-operator-controller-manager-574d45c66c-57m9k\" (UID: \"dedc5b41-d549-4015-b010-bc07cea3d318\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.547334 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrw89\" (UniqueName: \"kubernetes.io/projected/15d414b6-a515-4db4-b60c-a2b34004ea9c-kube-api-access-mrw89\") pod \"swift-operator-controller-manager-677c674df7-6rnp4\" (UID: \"15d414b6-a515-4db4-b60c-a2b34004ea9c\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.547378 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhl5n\" (UniqueName: \"kubernetes.io/projected/2525ca9b-eb81-4fce-86b4-a767db795de6-kube-api-access-dhl5n\") pod \"nova-operator-controller-manager-569cc54c5-c5vgk\" (UID: \"2525ca9b-eb81-4fce-86b4-a767db795de6\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.547422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkn2s\" (UniqueName: 
\"kubernetes.io/projected/f6f6b27c-94d8-456d-8d41-19c905065e1d-kube-api-access-fkn2s\") pod \"ovn-operator-controller-manager-bbc5b68f9-xmctv\" (UID: \"f6f6b27c-94d8-456d-8d41-19c905065e1d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.547444 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szdt9\" (UniqueName: \"kubernetes.io/projected/5602b15d-928b-4138-a7f0-66f8e8d037b8-kube-api-access-szdt9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-kwh58\" (UID: \"5602b15d-928b-4138-a7f0-66f8e8d037b8\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.547866 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.547904 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert podName:94d241ed-64bc-4152-b445-51ae5a61bb95 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:34.047891615 +0000 UTC m=+1061.829042304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" (UID: "94d241ed-64bc-4152-b445-51ae5a61bb95") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.564450 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.565326 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.575966 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2ppqd" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.578179 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqrgn\" (UniqueName: \"kubernetes.io/projected/94d241ed-64bc-4152-b445-51ae5a61bb95-kube-api-access-kqrgn\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.584685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsdwg\" (UniqueName: \"kubernetes.io/projected/dedc5b41-d549-4015-b010-bc07cea3d318-kube-api-access-wsdwg\") pod \"placement-operator-controller-manager-574d45c66c-57m9k\" (UID: \"dedc5b41-d549-4015-b010-bc07cea3d318\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.585398 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.586852 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhl5n\" (UniqueName: \"kubernetes.io/projected/2525ca9b-eb81-4fce-86b4-a767db795de6-kube-api-access-dhl5n\") pod \"nova-operator-controller-manager-569cc54c5-c5vgk\" (UID: \"2525ca9b-eb81-4fce-86b4-a767db795de6\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.587243 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fkn2s\" (UniqueName: \"kubernetes.io/projected/f6f6b27c-94d8-456d-8d41-19c905065e1d-kube-api-access-fkn2s\") pod \"ovn-operator-controller-manager-bbc5b68f9-xmctv\" (UID: \"f6f6b27c-94d8-456d-8d41-19c905065e1d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.588285 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdt9\" (UniqueName: \"kubernetes.io/projected/5602b15d-928b-4138-a7f0-66f8e8d037b8-kube-api-access-szdt9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-kwh58\" (UID: \"5602b15d-928b-4138-a7f0-66f8e8d037b8\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.627464 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.645072 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.646365 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.650486 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.650700 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrw89\" (UniqueName: \"kubernetes.io/projected/15d414b6-a515-4db4-b60c-a2b34004ea9c-kube-api-access-mrw89\") pod \"swift-operator-controller-manager-677c674df7-6rnp4\" (UID: \"15d414b6-a515-4db4-b60c-a2b34004ea9c\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.678505 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.679484 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-r572v" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.680365 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.693193 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.713298 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.715284 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.734361 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrw89\" (UniqueName: \"kubernetes.io/projected/15d414b6-a515-4db4-b60c-a2b34004ea9c-kube-api-access-mrw89\") pod \"swift-operator-controller-manager-677c674df7-6rnp4\" (UID: \"15d414b6-a515-4db4-b60c-a2b34004ea9c\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.756388 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpqgr\" (UniqueName: \"kubernetes.io/projected/f488d4b3-55b5-424e-b00e-0bd262fc5f4f-kube-api-access-fpqgr\") pod \"test-operator-controller-manager-5c5cb9c4d7-jhcz9\" (UID: \"f488d4b3-55b5-424e-b00e-0bd262fc5f4f\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.756455 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpq8m\" (UniqueName: \"kubernetes.io/projected/c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0-kube-api-access-gpq8m\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-jkths\" (UID: \"c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.756555 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.757349 4830 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.763412 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5zzsd" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.777618 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.790307 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.831348 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.842961 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.843879 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.845877 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-x665f" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.846138 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.846141 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.848152 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.855006 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.856489 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.864038 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpqgr\" (UniqueName: \"kubernetes.io/projected/f488d4b3-55b5-424e-b00e-0bd262fc5f4f-kube-api-access-fpqgr\") pod \"test-operator-controller-manager-5c5cb9c4d7-jhcz9\" (UID: \"f488d4b3-55b5-424e-b00e-0bd262fc5f4f\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.864175 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpq8m\" (UniqueName: \"kubernetes.io/projected/c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0-kube-api-access-gpq8m\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-jkths\" (UID: \"c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.864241 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.864310 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 
09:31:33.864342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phsks\" (UniqueName: \"kubernetes.io/projected/f24d67a9-4996-4315-8c38-fa4ef58e0a52-kube-api-access-phsks\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.864364 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhbzn\" (UniqueName: \"kubernetes.io/projected/733a981c-36a1-442b-8e24-16a7498efc54-kube-api-access-qhbzn\") pod \"watcher-operator-controller-manager-6dd88c6f67-vjktt\" (UID: \"733a981c-36a1-442b-8e24-16a7498efc54\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.864399 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbwjd\" (UniqueName: \"kubernetes.io/projected/14e5e0c3-1203-4a07-93bd-94578a7f0cb2-kube-api-access-mbwjd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nkqm4\" (UID: \"14e5e0c3-1203-4a07-93bd-94578a7f0cb2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.864482 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.864643 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.864697 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert podName:74ba9d62-2d47-46a5-bd26-1a81bb0a8484 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:34.864680086 +0000 UTC m=+1062.645830775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert") pod "infra-operator-controller-manager-5995f4446f-ntcnm" (UID: "74ba9d62-2d47-46a5-bd26-1a81bb0a8484") : secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.865779 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-k5cfl" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.883391 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4"] Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.888431 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.924485 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpqgr\" (UniqueName: \"kubernetes.io/projected/f488d4b3-55b5-424e-b00e-0bd262fc5f4f-kube-api-access-fpqgr\") pod \"test-operator-controller-manager-5c5cb9c4d7-jhcz9\" (UID: \"f488d4b3-55b5-424e-b00e-0bd262fc5f4f\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.925329 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpq8m\" (UniqueName: \"kubernetes.io/projected/c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0-kube-api-access-gpq8m\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-jkths\" (UID: \"c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.969995 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbwjd\" (UniqueName: \"kubernetes.io/projected/14e5e0c3-1203-4a07-93bd-94578a7f0cb2-kube-api-access-mbwjd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nkqm4\" (UID: \"14e5e0c3-1203-4a07-93bd-94578a7f0cb2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.970201 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.970257 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.970291 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phsks\" (UniqueName: \"kubernetes.io/projected/f24d67a9-4996-4315-8c38-fa4ef58e0a52-kube-api-access-phsks\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:33 crc kubenswrapper[4830]: I0311 09:31:33.970314 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhbzn\" (UniqueName: \"kubernetes.io/projected/733a981c-36a1-442b-8e24-16a7498efc54-kube-api-access-qhbzn\") pod \"watcher-operator-controller-manager-6dd88c6f67-vjktt\" (UID: \"733a981c-36a1-442b-8e24-16a7498efc54\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.970729 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.970783 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:34.470764061 +0000 UTC m=+1062.251914750 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "webhook-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.971069 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 09:31:33 crc kubenswrapper[4830]: E0311 09:31:33.971105 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:34.471093749 +0000 UTC m=+1062.252244438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "metrics-server-cert" not found Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.000070 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbwjd\" (UniqueName: \"kubernetes.io/projected/14e5e0c3-1203-4a07-93bd-94578a7f0cb2-kube-api-access-mbwjd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-nkqm4\" (UID: \"14e5e0c3-1203-4a07-93bd-94578a7f0cb2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.001127 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhbzn\" (UniqueName: \"kubernetes.io/projected/733a981c-36a1-442b-8e24-16a7498efc54-kube-api-access-qhbzn\") pod \"watcher-operator-controller-manager-6dd88c6f67-vjktt\" (UID: \"733a981c-36a1-442b-8e24-16a7498efc54\") " 
pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.019263 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phsks\" (UniqueName: \"kubernetes.io/projected/f24d67a9-4996-4315-8c38-fa4ef58e0a52-kube-api-access-phsks\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.041416 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.071696 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:34 crc kubenswrapper[4830]: E0311 09:31:34.071975 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:34 crc kubenswrapper[4830]: E0311 09:31:34.072065 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert podName:94d241ed-64bc-4152-b445-51ae5a61bb95 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:35.072046803 +0000 UTC m=+1062.853197492 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" (UID: "94d241ed-64bc-4152-b445-51ae5a61bb95") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.148408 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.172707 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.190932 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww"] Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.251233 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.316308 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" event={"ID":"d72c70bc-5f58-4c0f-a584-f352adf175e7","Type":"ContainerStarted","Data":"aeb285bace98627e38a156c16ea9822e7d79c5cb2388a14fdfd239e23c459e5c"} Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.476206 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.476256 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:34 crc kubenswrapper[4830]: E0311 09:31:34.476379 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 09:31:34 crc kubenswrapper[4830]: E0311 09:31:34.476440 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:35.476426126 +0000 UTC m=+1063.257576815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "metrics-server-cert" not found Mar 11 09:31:34 crc kubenswrapper[4830]: E0311 09:31:34.476380 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 09:31:34 crc kubenswrapper[4830]: E0311 09:31:34.476469 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:35.476463937 +0000 UTC m=+1063.257614626 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "webhook-server-cert" not found Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.638842 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm"] Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.654196 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv"] Mar 11 09:31:34 crc kubenswrapper[4830]: W0311 09:31:34.656847 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7442149_a02a_401b_b3bd_c1d470af5b3b.slice/crio-4cbadb84b23f6f22ea11aaac02f5831f7853bf09c480a955d161eaf821f0a12c WatchSource:0}: Error finding container 4cbadb84b23f6f22ea11aaac02f5831f7853bf09c480a955d161eaf821f0a12c: Status 404 returned error can't find the 
container with id 4cbadb84b23f6f22ea11aaac02f5831f7853bf09c480a955d161eaf821f0a12c Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.666794 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc"] Mar 11 09:31:34 crc kubenswrapper[4830]: W0311 09:31:34.669354 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16121653_f66c_441b_b1e2_8cd3c1e558e4.slice/crio-0484ca9f25a6059fe9a1535643e22bdac401ecac1ef0167ba66841c77764c9f4 WatchSource:0}: Error finding container 0484ca9f25a6059fe9a1535643e22bdac401ecac1ef0167ba66841c77764c9f4: Status 404 returned error can't find the container with id 0484ca9f25a6059fe9a1535643e22bdac401ecac1ef0167ba66841c77764c9f4 Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.699252 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-xg456"] Mar 11 09:31:34 crc kubenswrapper[4830]: W0311 09:31:34.706247 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3112f394_9b8e_43c2_9707_94ac1a2778db.slice/crio-dbd368f8484d60f960aa5928a2a7ab8e6dfce6bc6f46c83821816bbb3c7f03eb WatchSource:0}: Error finding container dbd368f8484d60f960aa5928a2a7ab8e6dfce6bc6f46c83821816bbb3c7f03eb: Status 404 returned error can't find the container with id dbd368f8484d60f960aa5928a2a7ab8e6dfce6bc6f46c83821816bbb3c7f03eb Mar 11 09:31:34 crc kubenswrapper[4830]: I0311 09:31:34.882436 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:34 crc 
kubenswrapper[4830]: E0311 09:31:34.882575 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:34 crc kubenswrapper[4830]: E0311 09:31:34.882622 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert podName:74ba9d62-2d47-46a5-bd26-1a81bb0a8484 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:36.88260698 +0000 UTC m=+1064.663757669 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert") pod "infra-operator-controller-manager-5995f4446f-ntcnm" (UID: "74ba9d62-2d47-46a5-bd26-1a81bb0a8484") : secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.084753 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.084990 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.085086 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert podName:94d241ed-64bc-4152-b445-51ae5a61bb95 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:37.08506134 +0000 UTC m=+1064.866212029 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" (UID: "94d241ed-64bc-4152-b445-51ae5a61bb95") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:35 crc kubenswrapper[4830]: W0311 09:31:35.112315 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2525ca9b_eb81_4fce_86b4_a767db795de6.slice/crio-9a05ab3357336ba06de3606f07c08d97a3e2f9e77f2ff367b77021e9a436e9d0 WatchSource:0}: Error finding container 9a05ab3357336ba06de3606f07c08d97a3e2f9e77f2ff367b77021e9a436e9d0: Status 404 returned error can't find the container with id 9a05ab3357336ba06de3606f07c08d97a3e2f9e77f2ff367b77021e9a436e9d0 Mar 11 09:31:35 crc kubenswrapper[4830]: W0311 09:31:35.116806 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14e5e0c3_1203_4a07_93bd_94578a7f0cb2.slice/crio-776fb1b09f9dbf14e78e96671225d5f6842eb1deea904c5035a5a0708baa521e WatchSource:0}: Error finding container 776fb1b09f9dbf14e78e96671225d5f6842eb1deea904c5035a5a0708baa521e: Status 404 returned error can't find the container with id 776fb1b09f9dbf14e78e96671225d5f6842eb1deea904c5035a5a0708baa521e Mar 11 09:31:35 crc kubenswrapper[4830]: W0311 09:31:35.122344 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a85b060_3965_4d51_b568_2b360fee4c44.slice/crio-e79b65251a196b11c379bcc7973340f4d71dc6680257072bf6768d47a8b14af6 WatchSource:0}: Error finding container e79b65251a196b11c379bcc7973340f4d71dc6680257072bf6768d47a8b14af6: Status 404 returned error can't find the container with id e79b65251a196b11c379bcc7973340f4d71dc6680257072bf6768d47a8b14af6 Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.129753 4830 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.135478 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.141382 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.147654 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.154244 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9"] Mar 11 09:31:35 crc kubenswrapper[4830]: W0311 09:31:35.157365 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f6b27c_94d8_456d_8d41_19c905065e1d.slice/crio-a51f4dd5b020cf6040a5a7bd2dca7e5100cb1f016cb4f108a0427ab1556f9fe5 WatchSource:0}: Error finding container a51f4dd5b020cf6040a5a7bd2dca7e5100cb1f016cb4f108a0427ab1556f9fe5: Status 404 returned error can't find the container with id a51f4dd5b020cf6040a5a7bd2dca7e5100cb1f016cb4f108a0427ab1556f9fe5 Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.158918 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.163678 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.168930 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.173352 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.179735 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m"] Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.220315 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m7wqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6d9d6b584d-7cg7m_openstack-operators(283b8bf7-046d-4600-be30-f578a6ec3c4d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.220611 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fpqgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-jhcz9_openstack-operators(f488d4b3-55b5-424e-b00e-0bd262fc5f4f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.221103 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gpq8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-jkths_openstack-operators(c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.221294 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qhbzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-vjktt_openstack-operators(733a981c-36a1-442b-8e24-16a7498efc54): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.226862 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" podUID="c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.227052 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" podUID="283b8bf7-046d-4600-be30-f578a6ec3c4d" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.227076 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" podUID="733a981c-36a1-442b-8e24-16a7498efc54" Mar 11 
09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.227200 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" podUID="f488d4b3-55b5-424e-b00e-0bd262fc5f4f" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.227490 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g"] Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.233401 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcz5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6bbb499bbc-mtd4g_openstack-operators(fe16642f-b4c0-45e6-b222-83fcc2c3fb5c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.234480 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" podUID="fe16642f-b4c0-45e6-b222-83fcc2c3fb5c" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.236320 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-szdt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-kwh58_openstack-operators(5602b15d-928b-4138-a7f0-66f8e8d037b8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.239240 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" podUID="5602b15d-928b-4138-a7f0-66f8e8d037b8" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.241060 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths"] Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.242628 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mrw89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-6rnp4_openstack-operators(15d414b6-a515-4db4-b60c-a2b34004ea9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.244191 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" podUID="15d414b6-a515-4db4-b60c-a2b34004ea9c" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.249837 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.261435 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.277335 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4"] Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.323906 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" event={"ID":"dedc5b41-d549-4015-b010-bc07cea3d318","Type":"ContainerStarted","Data":"b4d4b4b47eea8afa26562a4a76081a7a795561235cfa1698f1adb0f5a2196666"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.325551 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" event={"ID":"16121653-f66c-441b-b1e2-8cd3c1e558e4","Type":"ContainerStarted","Data":"0484ca9f25a6059fe9a1535643e22bdac401ecac1ef0167ba66841c77764c9f4"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.327055 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" event={"ID":"3112f394-9b8e-43c2-9707-94ac1a2778db","Type":"ContainerStarted","Data":"dbd368f8484d60f960aa5928a2a7ab8e6dfce6bc6f46c83821816bbb3c7f03eb"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.328179 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" event={"ID":"ceffcca8-5182-4f52-b359-e20664c1d527","Type":"ContainerStarted","Data":"237edf3cc4a6bd9df659764181803c74af6f1296eaed9ad69adc64f15fcd0ab9"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.331573 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" event={"ID":"14e5e0c3-1203-4a07-93bd-94578a7f0cb2","Type":"ContainerStarted","Data":"776fb1b09f9dbf14e78e96671225d5f6842eb1deea904c5035a5a0708baa521e"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.333219 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" event={"ID":"15d414b6-a515-4db4-b60c-a2b34004ea9c","Type":"ContainerStarted","Data":"bb620b2b0f503bc6b59f3d8f4dbffe85cd56df330559d0706240f6ded92504f8"} Mar 
11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.334726 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" podUID="15d414b6-a515-4db4-b60c-a2b34004ea9c" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.335588 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" event={"ID":"733a981c-36a1-442b-8e24-16a7498efc54","Type":"ContainerStarted","Data":"d65e7e59b9eb6e9b8a9f569f78df4496d03ec1f43478615dd2e1745121a93587"} Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.336468 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" podUID="733a981c-36a1-442b-8e24-16a7498efc54" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.337412 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" event={"ID":"2525ca9b-eb81-4fce-86b4-a767db795de6","Type":"ContainerStarted","Data":"9a05ab3357336ba06de3606f07c08d97a3e2f9e77f2ff367b77021e9a436e9d0"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.338678 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" event={"ID":"283b8bf7-046d-4600-be30-f578a6ec3c4d","Type":"ContainerStarted","Data":"a24575a102e269bd4b05f2b9123939c416210296dac4f062d1f978a9dadceeeb"} Mar 11 09:31:35 
crc kubenswrapper[4830]: E0311 09:31:35.339727 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" podUID="283b8bf7-046d-4600-be30-f578a6ec3c4d" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.340424 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" event={"ID":"c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0","Type":"ContainerStarted","Data":"e012861a1119577bc50524b6d5a5f7852e66d970a28fbf9229f709fc2439be0d"} Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.341555 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" podUID="c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.343156 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" event={"ID":"5602b15d-928b-4138-a7f0-66f8e8d037b8","Type":"ContainerStarted","Data":"1aa21a31f517063452bd43d022b755193bad093491e1cc854ac4fdd89bacf5a0"} Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.352641 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" podUID="5602b15d-928b-4138-a7f0-66f8e8d037b8" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.353539 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" event={"ID":"d7442149-a02a-401b-b3bd-c1d470af5b3b","Type":"ContainerStarted","Data":"4cbadb84b23f6f22ea11aaac02f5831f7853bf09c480a955d161eaf821f0a12c"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.355067 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" event={"ID":"c29c2a15-0eb3-41aa-b0b9-710a5ed56a87","Type":"ContainerStarted","Data":"187e26d4cc384ef75261b3fdc28bfa9903b3d23645045dfd5f8b6515000007c1"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.360882 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" event={"ID":"2a85b060-3965-4d51-b568-2b360fee4c44","Type":"ContainerStarted","Data":"e79b65251a196b11c379bcc7973340f4d71dc6680257072bf6768d47a8b14af6"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.362197 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" event={"ID":"fe16642f-b4c0-45e6-b222-83fcc2c3fb5c","Type":"ContainerStarted","Data":"a007ea79cf3912fb729ceb20cf176e1e27f16a23b3ae551e91c34dfb08c312f8"} Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.365631 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" podUID="fe16642f-b4c0-45e6-b222-83fcc2c3fb5c" Mar 11 
09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.367132 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" event={"ID":"f6f6b27c-94d8-456d-8d41-19c905065e1d","Type":"ContainerStarted","Data":"a51f4dd5b020cf6040a5a7bd2dca7e5100cb1f016cb4f108a0427ab1556f9fe5"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.368518 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" event={"ID":"1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f","Type":"ContainerStarted","Data":"51e20a0ba36f4bcff3e8463daf9d3b54e73936c5f2d4bae6b3e3792b15568fb6"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.371279 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" event={"ID":"f488d4b3-55b5-424e-b00e-0bd262fc5f4f","Type":"ContainerStarted","Data":"935db3c5e3f708988abac84366ceb1c91dd0dbeea92239506dc5514cb6a3be7e"} Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.372162 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" podUID="f488d4b3-55b5-424e-b00e-0bd262fc5f4f" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.374109 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" event={"ID":"6ee90085-fc25-4491-a2fc-9b45d5d8207a","Type":"ContainerStarted","Data":"d8506d82ba5708a47719df2c2e2e7c7366e21ec000a2adfded26b4869363f6c7"} Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.544440 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:35 crc kubenswrapper[4830]: I0311 09:31:35.544531 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.544568 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.544640 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:37.54461927 +0000 UTC m=+1065.325770069 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "webhook-server-cert" not found Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.544684 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 09:31:35 crc kubenswrapper[4830]: E0311 09:31:35.544727 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:37.544715842 +0000 UTC m=+1065.325866621 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "metrics-server-cert" not found Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.383685 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" podUID="15d414b6-a515-4db4-b60c-a2b34004ea9c" Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.383934 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" podUID="f488d4b3-55b5-424e-b00e-0bd262fc5f4f" Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.384010 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9182d1816c6fdb093d6328f1b0bf39296b9eccfa495f35e2198ec4764fa6288f\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" podUID="fe16642f-b4c0-45e6-b222-83fcc2c3fb5c" Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.388764 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" podUID="283b8bf7-046d-4600-be30-f578a6ec3c4d" Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.389161 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" podUID="5602b15d-928b-4138-a7f0-66f8e8d037b8" Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.389158 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" 
podUID="733a981c-36a1-442b-8e24-16a7498efc54" Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.390594 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" podUID="c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0" Mar 11 09:31:36 crc kubenswrapper[4830]: I0311 09:31:36.964359 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.964482 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:36 crc kubenswrapper[4830]: E0311 09:31:36.964525 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert podName:74ba9d62-2d47-46a5-bd26-1a81bb0a8484 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:40.964512491 +0000 UTC m=+1068.745663180 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert") pod "infra-operator-controller-manager-5995f4446f-ntcnm" (UID: "74ba9d62-2d47-46a5-bd26-1a81bb0a8484") : secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:37 crc kubenswrapper[4830]: I0311 09:31:37.168132 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:37 crc kubenswrapper[4830]: E0311 09:31:37.168342 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:37 crc kubenswrapper[4830]: E0311 09:31:37.168398 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert podName:94d241ed-64bc-4152-b445-51ae5a61bb95 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:41.16838172 +0000 UTC m=+1068.949532409 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" (UID: "94d241ed-64bc-4152-b445-51ae5a61bb95") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:37 crc kubenswrapper[4830]: I0311 09:31:37.581879 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:37 crc kubenswrapper[4830]: I0311 09:31:37.582306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:37 crc kubenswrapper[4830]: E0311 09:31:37.582068 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 09:31:37 crc kubenswrapper[4830]: E0311 09:31:37.582500 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:41.582485483 +0000 UTC m=+1069.363636172 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "metrics-server-cert" not found Mar 11 09:31:37 crc kubenswrapper[4830]: E0311 09:31:37.582448 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 09:31:37 crc kubenswrapper[4830]: E0311 09:31:37.582529 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:41.582523914 +0000 UTC m=+1069.363674603 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "webhook-server-cert" not found Mar 11 09:31:41 crc kubenswrapper[4830]: I0311 09:31:41.043377 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:41 crc kubenswrapper[4830]: E0311 09:31:41.043856 4830 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:41 crc kubenswrapper[4830]: E0311 09:31:41.044031 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert 
podName:74ba9d62-2d47-46a5-bd26-1a81bb0a8484 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:49.043998541 +0000 UTC m=+1076.825149230 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert") pod "infra-operator-controller-manager-5995f4446f-ntcnm" (UID: "74ba9d62-2d47-46a5-bd26-1a81bb0a8484") : secret "infra-operator-webhook-server-cert" not found Mar 11 09:31:41 crc kubenswrapper[4830]: I0311 09:31:41.247128 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:41 crc kubenswrapper[4830]: E0311 09:31:41.247279 4830 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:41 crc kubenswrapper[4830]: E0311 09:31:41.247339 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert podName:94d241ed-64bc-4152-b445-51ae5a61bb95 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:49.247324824 +0000 UTC m=+1077.028475513 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" (UID: "94d241ed-64bc-4152-b445-51ae5a61bb95") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 11 09:31:41 crc kubenswrapper[4830]: I0311 09:31:41.653639 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:41 crc kubenswrapper[4830]: I0311 09:31:41.653798 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:41 crc kubenswrapper[4830]: E0311 09:31:41.653933 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 09:31:41 crc kubenswrapper[4830]: E0311 09:31:41.653980 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:49.653966311 +0000 UTC m=+1077.435117000 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "webhook-server-cert" not found Mar 11 09:31:41 crc kubenswrapper[4830]: E0311 09:31:41.654053 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 09:31:41 crc kubenswrapper[4830]: E0311 09:31:41.654078 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:31:49.654069134 +0000 UTC m=+1077.435219823 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "metrics-server-cert" not found Mar 11 09:31:48 crc kubenswrapper[4830]: E0311 09:31:48.368297 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978" Mar 11 09:31:48 crc kubenswrapper[4830]: E0311 09:31:48.369079 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wsdwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-57m9k_openstack-operators(dedc5b41-d549-4015-b010-bc07cea3d318): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:31:48 crc kubenswrapper[4830]: E0311 09:31:48.370362 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" podUID="dedc5b41-d549-4015-b010-bc07cea3d318" Mar 11 09:31:48 crc kubenswrapper[4830]: E0311 09:31:48.545278 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" podUID="dedc5b41-d549-4015-b010-bc07cea3d318" Mar 11 09:31:49 crc kubenswrapper[4830]: I0311 09:31:49.087721 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:49 crc kubenswrapper[4830]: I0311 09:31:49.093618 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74ba9d62-2d47-46a5-bd26-1a81bb0a8484-cert\") pod \"infra-operator-controller-manager-5995f4446f-ntcnm\" (UID: \"74ba9d62-2d47-46a5-bd26-1a81bb0a8484\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:49 crc kubenswrapper[4830]: I0311 09:31:49.148008 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:31:49 crc kubenswrapper[4830]: I0311 09:31:49.290831 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:49 crc kubenswrapper[4830]: I0311 09:31:49.295721 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/94d241ed-64bc-4152-b445-51ae5a61bb95-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h\" (UID: \"94d241ed-64bc-4152-b445-51ae5a61bb95\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:49 crc kubenswrapper[4830]: I0311 09:31:49.358536 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:31:49 crc kubenswrapper[4830]: I0311 09:31:49.694984 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:49 crc kubenswrapper[4830]: I0311 09:31:49.695137 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:31:49 crc kubenswrapper[4830]: E0311 09:31:49.695168 4830 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 11 09:31:49 crc kubenswrapper[4830]: E0311 09:31:49.695228 4830 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 11 09:31:49 crc kubenswrapper[4830]: E0311 09:31:49.695237 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:32:05.695219224 +0000 UTC m=+1093.476369913 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "metrics-server-cert" not found Mar 11 09:31:49 crc kubenswrapper[4830]: E0311 09:31:49.695263 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs podName:f24d67a9-4996-4315-8c38-fa4ef58e0a52 nodeName:}" failed. No retries permitted until 2026-03-11 09:32:05.695251365 +0000 UTC m=+1093.476402054 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs") pod "openstack-operator-controller-manager-6fcc5fcbf7-mw66h" (UID: "f24d67a9-4996-4315-8c38-fa4ef58e0a52") : secret "webhook-server-cert" not found Mar 11 09:31:50 crc kubenswrapper[4830]: E0311 09:31:50.291209 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2" Mar 11 09:31:50 crc kubenswrapper[4830]: E0311 09:31:50.291412 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m2dxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-658d4cdd5-q98j9_openstack-operators(c29c2a15-0eb3-41aa-b0b9-710a5ed56a87): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:31:50 crc kubenswrapper[4830]: E0311 09:31:50.292703 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" podUID="c29c2a15-0eb3-41aa-b0b9-710a5ed56a87" Mar 11 09:31:50 crc kubenswrapper[4830]: E0311 09:31:50.557607 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b99cd5e08bd85c6aaf717519187ba7bfeea359e1537d43b73a7364b7c38116e2\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" podUID="c29c2a15-0eb3-41aa-b0b9-710a5ed56a87" Mar 11 09:31:51 crc kubenswrapper[4830]: E0311 09:31:51.044448 4830 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4" Mar 11 09:31:51 crc kubenswrapper[4830]: E0311 09:31:51.044638 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t59gq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-68f45f9d9f-9pvn7_openstack-operators(2a85b060-3965-4d51-b568-2b360fee4c44): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:31:51 crc kubenswrapper[4830]: E0311 09:31:51.045822 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" podUID="2a85b060-3965-4d51-b568-2b360fee4c44" Mar 11 09:31:51 crc kubenswrapper[4830]: E0311 09:31:51.554918 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 11 09:31:51 crc kubenswrapper[4830]: E0311 09:31:51.555131 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mbwjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-nkqm4_openstack-operators(14e5e0c3-1203-4a07-93bd-94578a7f0cb2): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:31:51 crc kubenswrapper[4830]: E0311 09:31:51.557125 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" podUID="14e5e0c3-1203-4a07-93bd-94578a7f0cb2" Mar 11 09:31:51 crc kubenswrapper[4830]: E0311 09:31:51.565488 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" podUID="14e5e0c3-1203-4a07-93bd-94578a7f0cb2" Mar 11 09:31:51 crc kubenswrapper[4830]: E0311 09:31:51.566979 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d89f3ca6e909f34d145a880829f5e63f1b6b2d11c520a9c5bea7ed1c30ce38f4\\\"\"" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" podUID="2a85b060-3965-4d51-b568-2b360fee4c44" Mar 11 09:31:52 crc kubenswrapper[4830]: E0311 09:31:52.165327 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 11 09:31:52 crc kubenswrapper[4830]: E0311 09:31:52.165538 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sbp6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-bcbp5_openstack-operators(1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:31:52 crc kubenswrapper[4830]: E0311 09:31:52.166832 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" podUID="1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f" Mar 11 09:31:52 crc kubenswrapper[4830]: E0311 09:31:52.575570 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" podUID="1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f" Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.143742 4830 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29553692-7d4kj"] Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.145285 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.148872 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.149075 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.149138 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.150410 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553692-7d4kj"] Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.244124 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2qh\" (UniqueName: \"kubernetes.io/projected/dcf580f9-330c-4fc3-85af-5a92d87a6d79-kube-api-access-4q2qh\") pod \"auto-csr-approver-29553692-7d4kj\" (UID: \"dcf580f9-330c-4fc3-85af-5a92d87a6d79\") " pod="openshift-infra/auto-csr-approver-29553692-7d4kj" Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.346679 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2qh\" (UniqueName: \"kubernetes.io/projected/dcf580f9-330c-4fc3-85af-5a92d87a6d79-kube-api-access-4q2qh\") pod \"auto-csr-approver-29553692-7d4kj\" (UID: \"dcf580f9-330c-4fc3-85af-5a92d87a6d79\") " pod="openshift-infra/auto-csr-approver-29553692-7d4kj" Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.369319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2qh\" (UniqueName: 
\"kubernetes.io/projected/dcf580f9-330c-4fc3-85af-5a92d87a6d79-kube-api-access-4q2qh\") pod \"auto-csr-approver-29553692-7d4kj\" (UID: \"dcf580f9-330c-4fc3-85af-5a92d87a6d79\") " pod="openshift-infra/auto-csr-approver-29553692-7d4kj" Mar 11 09:32:00 crc kubenswrapper[4830]: I0311 09:32:00.470891 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" Mar 11 09:32:01 crc kubenswrapper[4830]: E0311 09:32:01.817512 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d" Mar 11 09:32:01 crc kubenswrapper[4830]: E0311 09:32:01.818059 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gpq8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-jkths_openstack-operators(c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:32:01 crc kubenswrapper[4830]: E0311 09:32:01.820078 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" podUID="c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0" Mar 11 09:32:02 crc kubenswrapper[4830]: I0311 09:32:02.686855 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" Mar 11 09:32:02 crc kubenswrapper[4830]: I0311 09:32:02.707462 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" podStartSLOduration=13.215014052 podStartE2EDuration="30.707437543s" podCreationTimestamp="2026-03-11 09:31:32 +0000 UTC" firstStartedPulling="2026-03-11 09:31:34.65083864 +0000 UTC m=+1062.431989329" lastFinishedPulling="2026-03-11 09:31:52.143262131 +0000 UTC m=+1079.924412820" observedRunningTime="2026-03-11 09:32:02.704288326 +0000 UTC m=+1090.485439035" watchObservedRunningTime="2026-03-11 09:32:02.707437543 +0000 UTC m=+1090.488588232" Mar 11 09:32:02 crc kubenswrapper[4830]: I0311 09:32:02.813610 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h"] Mar 11 09:32:02 crc kubenswrapper[4830]: I0311 09:32:02.895954 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm"] Mar 11 09:32:02 crc kubenswrapper[4830]: I0311 09:32:02.994536 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553692-7d4kj"] Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.723450 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" event={"ID":"6ee90085-fc25-4491-a2fc-9b45d5d8207a","Type":"ContainerStarted","Data":"59c11b8f932265ad9169d3c907abe3800f75ae17a7d567ceba043ed4e8440b47"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.723760 4830 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.741711 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" event={"ID":"ceffcca8-5182-4f52-b359-e20664c1d527","Type":"ContainerStarted","Data":"bfa04d1ce768c70768397e3c2a3b89a2b4e43312812d2e225c5219860309a79a"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.744400 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" event={"ID":"74ba9d62-2d47-46a5-bd26-1a81bb0a8484","Type":"ContainerStarted","Data":"9d3a4513c27d4aa8814a80cf267de7abbb739a87e2d2917771311fbdea9d3339"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.756611 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" event={"ID":"16121653-f66c-441b-b1e2-8cd3c1e558e4","Type":"ContainerStarted","Data":"fe8b83fdb08a88aee99e1c05dcfdc09455dd8400497fe5239894abf2548dfc75"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.756668 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.759246 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" podStartSLOduration=12.179484528 podStartE2EDuration="30.759235633s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.17908882 +0000 UTC m=+1062.960239509" lastFinishedPulling="2026-03-11 09:31:53.758839925 +0000 UTC m=+1081.539990614" observedRunningTime="2026-03-11 09:32:03.754679677 +0000 UTC m=+1091.535830366" 
watchObservedRunningTime="2026-03-11 09:32:03.759235633 +0000 UTC m=+1091.540386322" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.762701 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" event={"ID":"733a981c-36a1-442b-8e24-16a7498efc54","Type":"ContainerStarted","Data":"c02b19a0ce0149819712ce84cf64da8fe5d53a8dfb264a3f97f9db7528fe10d9"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.763206 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.765862 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" event={"ID":"2525ca9b-eb81-4fce-86b4-a767db795de6","Type":"ContainerStarted","Data":"349dc1beff7922cdf621f38ce065952485dfed26c5a91166938f26de203c2d84"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.766375 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.771182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" event={"ID":"fe16642f-b4c0-45e6-b222-83fcc2c3fb5c","Type":"ContainerStarted","Data":"bb9654de907296d56f8fdc5507605d50aea3a74d23381f5532f395063ece2c85"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.771769 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.776552 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" 
event={"ID":"f6f6b27c-94d8-456d-8d41-19c905065e1d","Type":"ContainerStarted","Data":"ea84a2cbc37f4f3206b85e3deaf7849d93911d8096c226684359b1bc22557a3e"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.777080 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.784092 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" event={"ID":"d72c70bc-5f58-4c0f-a584-f352adf175e7","Type":"ContainerStarted","Data":"d3f67c6fe934cf5b14e85afbd2dd3ef70264638cdcd8e49b8ec78c02b2f4a01b"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.784779 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.800471 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" event={"ID":"94d241ed-64bc-4152-b445-51ae5a61bb95","Type":"ContainerStarted","Data":"7e73ff7e526ad89ec6c2510d4ae3b007d29b435a7d12517fd07a728ba37ba04f"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.844728 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" event={"ID":"d7442149-a02a-401b-b3bd-c1d470af5b3b","Type":"ContainerStarted","Data":"0828563f61c19e3b1afe42ec04c125fd4596b7d1274ede1f807b19a4aacf918d"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.850817 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" podStartSLOduration=3.648732538 podStartE2EDuration="30.850792206s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 
09:31:35.233216288 +0000 UTC m=+1063.014366977" lastFinishedPulling="2026-03-11 09:32:02.435275956 +0000 UTC m=+1090.216426645" observedRunningTime="2026-03-11 09:32:03.831211794 +0000 UTC m=+1091.612362483" watchObservedRunningTime="2026-03-11 09:32:03.850792206 +0000 UTC m=+1091.631942905" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.852495 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.859648 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" event={"ID":"dcf580f9-330c-4fc3-85af-5a92d87a6d79","Type":"ContainerStarted","Data":"81e31868693a08e6bdaa12060f1f6593a1d28c4d21078bc819a5ef98419b167c"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.860569 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" podStartSLOduration=14.388984702 podStartE2EDuration="31.860547746s" podCreationTimestamp="2026-03-11 09:31:32 +0000 UTC" firstStartedPulling="2026-03-11 09:31:34.671608805 +0000 UTC m=+1062.452759494" lastFinishedPulling="2026-03-11 09:31:52.143171839 +0000 UTC m=+1079.924322538" observedRunningTime="2026-03-11 09:32:03.784991006 +0000 UTC m=+1091.566141695" watchObservedRunningTime="2026-03-11 09:32:03.860547746 +0000 UTC m=+1091.641698435" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.893522 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" event={"ID":"15d414b6-a515-4db4-b60c-a2b34004ea9c","Type":"ContainerStarted","Data":"b8c5a6594db24073f6eed723cc0ca05ba187ca897cf77f2f23a9f5a5336f954e"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.894074 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.926579 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" podStartSLOduration=12.896776218 podStartE2EDuration="30.926563962s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.115300186 +0000 UTC m=+1062.896450875" lastFinishedPulling="2026-03-11 09:31:53.14508792 +0000 UTC m=+1080.926238619" observedRunningTime="2026-03-11 09:32:03.881252449 +0000 UTC m=+1091.662403138" watchObservedRunningTime="2026-03-11 09:32:03.926563962 +0000 UTC m=+1091.707714651" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.927069 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" event={"ID":"5602b15d-928b-4138-a7f0-66f8e8d037b8","Type":"ContainerStarted","Data":"21dbf73e609149205f288a0841c766c5899b4418ec31d2e77319b59ceed2db5c"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.927311 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.928641 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" podStartSLOduration=3.7260013929999998 podStartE2EDuration="30.928634318s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.221138983 +0000 UTC m=+1063.002289672" lastFinishedPulling="2026-03-11 09:32:02.423771908 +0000 UTC m=+1090.204922597" observedRunningTime="2026-03-11 09:32:03.922280833 +0000 UTC m=+1091.703431512" watchObservedRunningTime="2026-03-11 09:32:03.928634318 +0000 UTC m=+1091.709785007" Mar 11 09:32:03 crc 
kubenswrapper[4830]: I0311 09:32:03.936305 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" event={"ID":"f488d4b3-55b5-424e-b00e-0bd262fc5f4f","Type":"ContainerStarted","Data":"7c83b24391386b7720f42ed7ae4d85e89fc34b8d3d2bb50735e76ff2fa2e77b0"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.936741 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.944773 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" podStartSLOduration=12.966027683 podStartE2EDuration="30.944758305s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.161240846 +0000 UTC m=+1062.942391535" lastFinishedPulling="2026-03-11 09:31:53.139971468 +0000 UTC m=+1080.921122157" observedRunningTime="2026-03-11 09:32:03.943383507 +0000 UTC m=+1091.724534196" watchObservedRunningTime="2026-03-11 09:32:03.944758305 +0000 UTC m=+1091.725908994" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.950351 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" event={"ID":"dedc5b41-d549-4015-b010-bc07cea3d318","Type":"ContainerStarted","Data":"bb7663ec3d697ce6530a51517965355df29a7ae571cf154b38aa6ca719f1f753"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.950898 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.962467 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" 
event={"ID":"283b8bf7-046d-4600-be30-f578a6ec3c4d","Type":"ContainerStarted","Data":"4dd7c3ec730d1606face4f4b250bb6a4460025fcecb2443915bc7f88c391278c"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.963198 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.964359 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" event={"ID":"3112f394-9b8e-43c2-9707-94ac1a2778db","Type":"ContainerStarted","Data":"1b57a75ca00a9e9de7345d250c853a47c59fe165530d573c84e8a4caabc24d57"} Mar 11 09:32:03 crc kubenswrapper[4830]: I0311 09:32:03.964722 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.015893 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" podStartSLOduration=3.86395038 podStartE2EDuration="31.015871922s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.220487145 +0000 UTC m=+1063.001637834" lastFinishedPulling="2026-03-11 09:32:02.372408687 +0000 UTC m=+1090.153559376" observedRunningTime="2026-03-11 09:32:04.014066381 +0000 UTC m=+1091.795217080" watchObservedRunningTime="2026-03-11 09:32:04.015871922 +0000 UTC m=+1091.797022611" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.016233 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" podStartSLOduration=14.112394011 podStartE2EDuration="32.016227231s" podCreationTimestamp="2026-03-11 09:31:32 +0000 UTC" firstStartedPulling="2026-03-11 09:31:34.239519544 +0000 UTC m=+1062.020670233" 
lastFinishedPulling="2026-03-11 09:31:52.143352744 +0000 UTC m=+1079.924503453" observedRunningTime="2026-03-11 09:32:03.973344025 +0000 UTC m=+1091.754494714" watchObservedRunningTime="2026-03-11 09:32:04.016227231 +0000 UTC m=+1091.797377920" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.052448 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" podStartSLOduration=3.940953741 podStartE2EDuration="31.052427103s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.233476915 +0000 UTC m=+1063.014627604" lastFinishedPulling="2026-03-11 09:32:02.344950277 +0000 UTC m=+1090.126100966" observedRunningTime="2026-03-11 09:32:04.049472631 +0000 UTC m=+1091.830623340" watchObservedRunningTime="2026-03-11 09:32:04.052427103 +0000 UTC m=+1091.833577792" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.095986 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" podStartSLOduration=3.993941155 podStartE2EDuration="31.095964387s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.242476263 +0000 UTC m=+1063.023626952" lastFinishedPulling="2026-03-11 09:32:02.344499495 +0000 UTC m=+1090.125650184" observedRunningTime="2026-03-11 09:32:04.078075042 +0000 UTC m=+1091.859225741" watchObservedRunningTime="2026-03-11 09:32:04.095964387 +0000 UTC m=+1091.877115076" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.107267 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" podStartSLOduration=3.795002493 podStartE2EDuration="31.107248609s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.179422119 +0000 UTC m=+1062.960572808" 
lastFinishedPulling="2026-03-11 09:32:02.491668225 +0000 UTC m=+1090.272818924" observedRunningTime="2026-03-11 09:32:04.102360054 +0000 UTC m=+1091.883510753" watchObservedRunningTime="2026-03-11 09:32:04.107248609 +0000 UTC m=+1091.888399288" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.169366 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" podStartSLOduration=13.734963749 podStartE2EDuration="31.169344696s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:34.708598927 +0000 UTC m=+1062.489749616" lastFinishedPulling="2026-03-11 09:31:52.142979874 +0000 UTC m=+1079.924130563" observedRunningTime="2026-03-11 09:32:04.135570752 +0000 UTC m=+1091.916721451" watchObservedRunningTime="2026-03-11 09:32:04.169344696 +0000 UTC m=+1091.950495385" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.189073 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" podStartSLOduration=3.918761115 podStartE2EDuration="31.189057931s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.220134345 +0000 UTC m=+1063.001285034" lastFinishedPulling="2026-03-11 09:32:02.490431151 +0000 UTC m=+1090.271581850" observedRunningTime="2026-03-11 09:32:04.185450262 +0000 UTC m=+1091.966600961" watchObservedRunningTime="2026-03-11 09:32:04.189057931 +0000 UTC m=+1091.970208620" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.218534 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" podStartSLOduration=14.737554102 podStartE2EDuration="32.218511766s" podCreationTimestamp="2026-03-11 09:31:32 +0000 UTC" firstStartedPulling="2026-03-11 09:31:34.662294817 +0000 UTC m=+1062.443445506" 
lastFinishedPulling="2026-03-11 09:31:52.143252481 +0000 UTC m=+1079.924403170" observedRunningTime="2026-03-11 09:32:04.21178388 +0000 UTC m=+1091.992934579" watchObservedRunningTime="2026-03-11 09:32:04.218511766 +0000 UTC m=+1091.999662475" Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.996727 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" event={"ID":"2a85b060-3965-4d51-b568-2b360fee4c44","Type":"ContainerStarted","Data":"741c66aa39b77eb8be553e113affc6fcb49d9533d8b1832633b5123e745b2666"} Mar 11 09:32:04 crc kubenswrapper[4830]: I0311 09:32:04.997634 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:04.999220 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" event={"ID":"dcf580f9-330c-4fc3-85af-5a92d87a6d79","Type":"ContainerStarted","Data":"ebdfdf4bf38a9b39b6cf4119f1a1294f6fdbf9ad56e0094b52538ccc2ce2c9b7"} Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:05.001354 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" event={"ID":"14e5e0c3-1203-4a07-93bd-94578a7f0cb2","Type":"ContainerStarted","Data":"841a9b026bef9ce3590f72769a408090ae27116d7ecb2928f24dd91031d00c82"} Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:05.018217 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" podStartSLOduration=2.652613467 podStartE2EDuration="32.018205384s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.148989428 +0000 UTC m=+1062.930140117" lastFinishedPulling="2026-03-11 09:32:04.514581345 +0000 UTC m=+1092.295732034" 
observedRunningTime="2026-03-11 09:32:05.017199346 +0000 UTC m=+1092.798350035" watchObservedRunningTime="2026-03-11 09:32:05.018205384 +0000 UTC m=+1092.799356073" Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:05.037252 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" podStartSLOduration=3.801792009 podStartE2EDuration="5.03723728s" podCreationTimestamp="2026-03-11 09:32:00 +0000 UTC" firstStartedPulling="2026-03-11 09:32:03.014206867 +0000 UTC m=+1090.795357556" lastFinishedPulling="2026-03-11 09:32:04.249652138 +0000 UTC m=+1092.030802827" observedRunningTime="2026-03-11 09:32:05.034114884 +0000 UTC m=+1092.815265573" watchObservedRunningTime="2026-03-11 09:32:05.03723728 +0000 UTC m=+1092.818387969" Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:05.760053 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:05.760523 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:05.765570 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-webhook-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" 
(UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:05.775579 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f24d67a9-4996-4315-8c38-fa4ef58e0a52-metrics-certs\") pod \"openstack-operator-controller-manager-6fcc5fcbf7-mw66h\" (UID: \"f24d67a9-4996-4315-8c38-fa4ef58e0a52\") " pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:32:05 crc kubenswrapper[4830]: I0311 09:32:05.951498 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-nkqm4" podStartSLOduration=3.6503047410000002 podStartE2EDuration="32.951482016s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.122627949 +0000 UTC m=+1062.903778638" lastFinishedPulling="2026-03-11 09:32:04.423805224 +0000 UTC m=+1092.204955913" observedRunningTime="2026-03-11 09:32:05.05132984 +0000 UTC m=+1092.832480549" watchObservedRunningTime="2026-03-11 09:32:05.951482016 +0000 UTC m=+1093.732632705" Mar 11 09:32:06 crc kubenswrapper[4830]: I0311 09:32:06.008515 4830 generic.go:334] "Generic (PLEG): container finished" podID="dcf580f9-330c-4fc3-85af-5a92d87a6d79" containerID="ebdfdf4bf38a9b39b6cf4119f1a1294f6fdbf9ad56e0094b52538ccc2ce2c9b7" exitCode=0 Mar 11 09:32:06 crc kubenswrapper[4830]: I0311 09:32:06.008579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" event={"ID":"dcf580f9-330c-4fc3-85af-5a92d87a6d79","Type":"ContainerDied","Data":"ebdfdf4bf38a9b39b6cf4119f1a1294f6fdbf9ad56e0094b52538ccc2ce2c9b7"} Mar 11 09:32:06 crc kubenswrapper[4830]: I0311 09:32:06.069215 4830 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-x665f" Mar 11 09:32:06 crc kubenswrapper[4830]: I0311 09:32:06.080357 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:32:07 crc kubenswrapper[4830]: I0311 09:32:07.482618 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" Mar 11 09:32:07 crc kubenswrapper[4830]: I0311 09:32:07.494159 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q2qh\" (UniqueName: \"kubernetes.io/projected/dcf580f9-330c-4fc3-85af-5a92d87a6d79-kube-api-access-4q2qh\") pod \"dcf580f9-330c-4fc3-85af-5a92d87a6d79\" (UID: \"dcf580f9-330c-4fc3-85af-5a92d87a6d79\") " Mar 11 09:32:07 crc kubenswrapper[4830]: I0311 09:32:07.503818 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf580f9-330c-4fc3-85af-5a92d87a6d79-kube-api-access-4q2qh" (OuterVolumeSpecName: "kube-api-access-4q2qh") pod "dcf580f9-330c-4fc3-85af-5a92d87a6d79" (UID: "dcf580f9-330c-4fc3-85af-5a92d87a6d79"). InnerVolumeSpecName "kube-api-access-4q2qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:32:07 crc kubenswrapper[4830]: I0311 09:32:07.596720 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q2qh\" (UniqueName: \"kubernetes.io/projected/dcf580f9-330c-4fc3-85af-5a92d87a6d79-kube-api-access-4q2qh\") on node \"crc\" DevicePath \"\"" Mar 11 09:32:07 crc kubenswrapper[4830]: I0311 09:32:07.711709 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h"] Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.042991 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" event={"ID":"1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f","Type":"ContainerStarted","Data":"2d1bc1fff8a8c395bc48c3e37d8b67e48dfd45ef95922c845d9fa9db82183d29"} Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.043706 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.044749 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" event={"ID":"94d241ed-64bc-4152-b445-51ae5a61bb95","Type":"ContainerStarted","Data":"3a9d133ac59fe80e2d1f3fd4f4e799a956ee0f22398ab0b122adfcf4391b2803"} Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.044881 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.046593 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" 
event={"ID":"c29c2a15-0eb3-41aa-b0b9-710a5ed56a87","Type":"ContainerStarted","Data":"8f42af2c2e740c8e4e7033fdae80f25d751d0d640b0c2e6dbac4d209c819d58d"} Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.046827 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.048240 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" event={"ID":"dcf580f9-330c-4fc3-85af-5a92d87a6d79","Type":"ContainerDied","Data":"81e31868693a08e6bdaa12060f1f6593a1d28c4d21078bc819a5ef98419b167c"} Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.048341 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81e31868693a08e6bdaa12060f1f6593a1d28c4d21078bc819a5ef98419b167c" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.048267 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553692-7d4kj" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.057827 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" podStartSLOduration=2.920062465 podStartE2EDuration="35.057814033s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.179662247 +0000 UTC m=+1062.960812946" lastFinishedPulling="2026-03-11 09:32:07.317413825 +0000 UTC m=+1095.098564514" observedRunningTime="2026-03-11 09:32:08.057763412 +0000 UTC m=+1095.838914141" watchObservedRunningTime="2026-03-11 09:32:08.057814033 +0000 UTC m=+1095.838964722" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.064911 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" event={"ID":"74ba9d62-2d47-46a5-bd26-1a81bb0a8484","Type":"ContainerStarted","Data":"47e78ff060ceab36260550d86d4b878b3053b071e929493fff9196d263576484"} Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.064988 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.066801 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" event={"ID":"f24d67a9-4996-4315-8c38-fa4ef58e0a52","Type":"ContainerStarted","Data":"872712f088a5a53ee10835dab39916cc190ace0b4831ae6b06dc5717685b0cc9"} Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.066830 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" event={"ID":"f24d67a9-4996-4315-8c38-fa4ef58e0a52","Type":"ContainerStarted","Data":"307bf067801c97ca398747a3d9c28a1e744d92633059268d71aa293f01c9ce53"} 
Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.066976 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.090490 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" podStartSLOduration=30.609497093 podStartE2EDuration="35.090465176s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:32:02.834301212 +0000 UTC m=+1090.615451891" lastFinishedPulling="2026-03-11 09:32:07.315269275 +0000 UTC m=+1095.096419974" observedRunningTime="2026-03-11 09:32:08.085235461 +0000 UTC m=+1095.866386170" watchObservedRunningTime="2026-03-11 09:32:08.090465176 +0000 UTC m=+1095.871615905" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.121795 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" podStartSLOduration=2.957790777 podStartE2EDuration="35.121773571s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.154221442 +0000 UTC m=+1062.935372121" lastFinishedPulling="2026-03-11 09:32:07.318204206 +0000 UTC m=+1095.099354915" observedRunningTime="2026-03-11 09:32:08.120146167 +0000 UTC m=+1095.901296866" watchObservedRunningTime="2026-03-11 09:32:08.121773571 +0000 UTC m=+1095.902924270" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.154682 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" podStartSLOduration=35.154664392 podStartE2EDuration="35.154664392s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-11 09:32:08.151847644 +0000 UTC m=+1095.932998363" watchObservedRunningTime="2026-03-11 09:32:08.154664392 +0000 UTC m=+1095.935815091" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.519804 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" podStartSLOduration=31.122795138 podStartE2EDuration="35.519779769s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:32:02.948170651 +0000 UTC m=+1090.729321340" lastFinishedPulling="2026-03-11 09:32:07.345155262 +0000 UTC m=+1095.126305971" observedRunningTime="2026-03-11 09:32:08.181838873 +0000 UTC m=+1095.962989582" watchObservedRunningTime="2026-03-11 09:32:08.519779769 +0000 UTC m=+1096.300930458" Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.552069 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553686-xg792"] Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.556687 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553686-xg792"] Mar 11 09:32:08 crc kubenswrapper[4830]: I0311 09:32:08.939917 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c18180-afd3-4329-bc8c-8bf32ab5e82c" path="/var/lib/kubelet/pods/e1c18180-afd3-4329-bc8c-8bf32ab5e82c/volumes" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.295987 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.314562 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-rjsvm" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.337676 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.373875 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-7kzdv" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.415584 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-xg456" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.444465 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-7cg7m" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.461267 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-mtd4g" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.632230 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.655825 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.682146 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-5gkz6" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.686756 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-q98j9" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.701726 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-569cc54c5-c5vgk" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.716178 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-kwh58" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.780289 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-xmctv" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.835094 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-57m9k" Mar 11 09:32:13 crc kubenswrapper[4830]: I0311 09:32:13.891891 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-6rnp4" Mar 11 09:32:14 crc kubenswrapper[4830]: I0311 09:32:14.175560 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-jhcz9" Mar 11 09:32:14 crc kubenswrapper[4830]: I0311 09:32:14.254374 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-vjktt" Mar 11 09:32:16 crc kubenswrapper[4830]: I0311 09:32:16.087999 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6fcc5fcbf7-mw66h" Mar 11 09:32:16 crc kubenswrapper[4830]: E0311 09:32:16.934959 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" podUID="c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0" Mar 11 09:32:19 crc kubenswrapper[4830]: I0311 09:32:19.154489 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-ntcnm" Mar 11 09:32:19 crc kubenswrapper[4830]: I0311 09:32:19.365424 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h" Mar 11 09:32:30 crc kubenswrapper[4830]: I0311 09:32:30.232587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" event={"ID":"c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0","Type":"ContainerStarted","Data":"b956f5961c15f190bf29c554e860d9103e56e4b5dbaad89631b7baa9c8ab3f51"} Mar 11 09:32:30 crc kubenswrapper[4830]: I0311 09:32:30.233279 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" Mar 11 09:32:30 crc kubenswrapper[4830]: I0311 09:32:30.258448 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" podStartSLOduration=3.017948578 podStartE2EDuration="57.258428313s" podCreationTimestamp="2026-03-11 09:31:33 +0000 UTC" firstStartedPulling="2026-03-11 09:31:35.220862466 +0000 UTC m=+1063.002013145" lastFinishedPulling="2026-03-11 09:32:29.461342181 +0000 UTC m=+1117.242492880" observedRunningTime="2026-03-11 09:32:30.252138888 +0000 UTC m=+1118.033289587" watchObservedRunningTime="2026-03-11 09:32:30.258428313 +0000 UTC m=+1118.039579022" Mar 11 09:32:34 crc kubenswrapper[4830]: I0311 09:32:34.151996 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-jkths" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.596879 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4sq7m"] Mar 11 09:32:50 crc kubenswrapper[4830]: E0311 09:32:50.597546 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf580f9-330c-4fc3-85af-5a92d87a6d79" containerName="oc" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.597559 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf580f9-330c-4fc3-85af-5a92d87a6d79" containerName="oc" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.597679 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf580f9-330c-4fc3-85af-5a92d87a6d79" containerName="oc" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.598373 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.600857 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.601092 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.601280 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-smshl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.601791 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.611845 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4sq7m"] Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.649782 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9fncl"] Mar 11 09:32:50 crc 
kubenswrapper[4830]: I0311 09:32:50.650952 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.656221 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.661309 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9fncl"] Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.661559 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9qn\" (UniqueName: \"kubernetes.io/projected/c6b7e449-011e-4574-b449-fd58ec84aadd-kube-api-access-gn9qn\") pod \"dnsmasq-dns-675f4bcbfc-4sq7m\" (UID: \"c6b7e449-011e-4574-b449-fd58ec84aadd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.661615 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-config\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.661638 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t8d5\" (UniqueName: \"kubernetes.io/projected/f58f0c06-356c-4d77-a927-c97077fe8122-kube-api-access-4t8d5\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.661663 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6b7e449-011e-4574-b449-fd58ec84aadd-config\") pod 
\"dnsmasq-dns-675f4bcbfc-4sq7m\" (UID: \"c6b7e449-011e-4574-b449-fd58ec84aadd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.661724 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.762450 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9qn\" (UniqueName: \"kubernetes.io/projected/c6b7e449-011e-4574-b449-fd58ec84aadd-kube-api-access-gn9qn\") pod \"dnsmasq-dns-675f4bcbfc-4sq7m\" (UID: \"c6b7e449-011e-4574-b449-fd58ec84aadd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.762511 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-config\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.762538 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t8d5\" (UniqueName: \"kubernetes.io/projected/f58f0c06-356c-4d77-a927-c97077fe8122-kube-api-access-4t8d5\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.762567 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6b7e449-011e-4574-b449-fd58ec84aadd-config\") pod \"dnsmasq-dns-675f4bcbfc-4sq7m\" (UID: 
\"c6b7e449-011e-4574-b449-fd58ec84aadd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.762623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.763647 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6b7e449-011e-4574-b449-fd58ec84aadd-config\") pod \"dnsmasq-dns-675f4bcbfc-4sq7m\" (UID: \"c6b7e449-011e-4574-b449-fd58ec84aadd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.763715 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.763821 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-config\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.782835 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9qn\" (UniqueName: \"kubernetes.io/projected/c6b7e449-011e-4574-b449-fd58ec84aadd-kube-api-access-gn9qn\") pod \"dnsmasq-dns-675f4bcbfc-4sq7m\" (UID: \"c6b7e449-011e-4574-b449-fd58ec84aadd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 
09:32:50.782964 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t8d5\" (UniqueName: \"kubernetes.io/projected/f58f0c06-356c-4d77-a927-c97077fe8122-kube-api-access-4t8d5\") pod \"dnsmasq-dns-78dd6ddcc-9fncl\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.915413 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:32:50 crc kubenswrapper[4830]: I0311 09:32:50.966430 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:32:51 crc kubenswrapper[4830]: I0311 09:32:51.388906 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4sq7m"] Mar 11 09:32:51 crc kubenswrapper[4830]: I0311 09:32:51.428262 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9fncl"] Mar 11 09:32:51 crc kubenswrapper[4830]: W0311 09:32:51.430773 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf58f0c06_356c_4d77_a927_c97077fe8122.slice/crio-95a07e59da5ea2d6b33f66a7c3d9d39015d32ec40425ca1d7bfe596451689b7c WatchSource:0}: Error finding container 95a07e59da5ea2d6b33f66a7c3d9d39015d32ec40425ca1d7bfe596451689b7c: Status 404 returned error can't find the container with id 95a07e59da5ea2d6b33f66a7c3d9d39015d32ec40425ca1d7bfe596451689b7c Mar 11 09:32:52 crc kubenswrapper[4830]: I0311 09:32:52.393649 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" event={"ID":"c6b7e449-011e-4574-b449-fd58ec84aadd","Type":"ContainerStarted","Data":"77d064bca2b20d397b4766b4d9a1c8a97b4c0432334293ea4884b7617bf757d8"} Mar 11 09:32:52 crc kubenswrapper[4830]: I0311 09:32:52.398977 4830 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" event={"ID":"f58f0c06-356c-4d77-a927-c97077fe8122","Type":"ContainerStarted","Data":"95a07e59da5ea2d6b33f66a7c3d9d39015d32ec40425ca1d7bfe596451689b7c"} Mar 11 09:32:54 crc kubenswrapper[4830]: I0311 09:32:54.802416 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" podUID="2a85b060-3965-4d51-b568-2b360fee4c44" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.78:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:32:54 crc kubenswrapper[4830]: I0311 09:32:54.803416 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" podUID="16121653-f66c-441b-b1e2-8cd3c1e558e4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.63:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:32:54 crc kubenswrapper[4830]: I0311 09:32:54.804102 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" podUID="1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:32:54 crc kubenswrapper[4830]: I0311 09:32:54.804136 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-bcbp5" podUID="1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:32:54 crc kubenswrapper[4830]: I0311 09:32:54.804163 4830 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-n5khc" podUID="16121653-f66c-441b-b1e2-8cd3c1e558e4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.63:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:32:54 crc kubenswrapper[4830]: I0311 09:32:54.762571 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-9pvn7" podUID="2a85b060-3965-4d51-b568-2b360fee4c44" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.78:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:32:54 crc kubenswrapper[4830]: I0311 09:32:54.806560 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-cxpww" podUID="d72c70bc-5f58-4c0f-a584-f352adf175e7" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.61:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.394516 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4sq7m"] Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.416655 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fld8l"] Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.417799 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.437309 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fld8l"] Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.506966 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.507090 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttql2\" (UniqueName: \"kubernetes.io/projected/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-kube-api-access-ttql2\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.507133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-config\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.608924 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttql2\" (UniqueName: \"kubernetes.io/projected/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-kube-api-access-ttql2\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.608987 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-config\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.609748 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-config\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.609906 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.610475 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.636432 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttql2\" (UniqueName: \"kubernetes.io/projected/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-kube-api-access-ttql2\") pod \"dnsmasq-dns-5ccc8479f9-fld8l\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.696738 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9fncl"] Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.719923 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-gfgnd"] Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.720971 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.739263 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gfgnd"] Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.747430 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.812871 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-config\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.812933 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.813224 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tthbs\" (UniqueName: \"kubernetes.io/projected/cdd3efe1-119d-408b-a76d-e98ee494dfde-kube-api-access-tthbs\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.925942 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tthbs\" (UniqueName: 
\"kubernetes.io/projected/cdd3efe1-119d-408b-a76d-e98ee494dfde-kube-api-access-tthbs\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.926159 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-config\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.926318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.930633 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.930797 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-config\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:55 crc kubenswrapper[4830]: I0311 09:32:55.963419 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tthbs\" (UniqueName: \"kubernetes.io/projected/cdd3efe1-119d-408b-a76d-e98ee494dfde-kube-api-access-tthbs\") pod \"dnsmasq-dns-57d769cc4f-gfgnd\" 
(UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.056643 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.278118 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.279721 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.286506 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zg7zt" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.287885 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.291242 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.296879 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.297518 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.299871 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.340605 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad5f765-f3dd-42f3-9829-2323ea982c58-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 
11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.340700 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ad5f765-f3dd-42f3-9829-2323ea982c58-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.340738 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.340781 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.340801 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.340822 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.340845 
4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8c85\" (UniqueName: \"kubernetes.io/projected/7ad5f765-f3dd-42f3-9829-2323ea982c58-kube-api-access-g8c85\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.340869 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ad5f765-f3dd-42f3-9829-2323ea982c58-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.351534 4830 scope.go:117] "RemoveContainer" containerID="dcbf17aeb6cc373d6e5b4d9db7d9c4370421143343b6e60a03722cd25a9c1dbd" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.366939 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fld8l"] Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.441920 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ad5f765-f3dd-42f3-9829-2323ea982c58-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442279 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442335 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442374 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442406 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442443 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8c85\" (UniqueName: \"kubernetes.io/projected/7ad5f765-f3dd-42f3-9829-2323ea982c58-kube-api-access-g8c85\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442469 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ad5f765-f3dd-42f3-9829-2323ea982c58-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442521 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad5f765-f3dd-42f3-9829-2323ea982c58-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ad5f765-f3dd-42f3-9829-2323ea982c58-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.442969 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.444531 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.447250 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-config-data-default\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.450183 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ad5f765-f3dd-42f3-9829-2323ea982c58-kolla-config\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.451531 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad5f765-f3dd-42f3-9829-2323ea982c58-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.454765 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ad5f765-f3dd-42f3-9829-2323ea982c58-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.464714 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8c85\" (UniqueName: \"kubernetes.io/projected/7ad5f765-f3dd-42f3-9829-2323ea982c58-kube-api-access-g8c85\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.465050 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"7ad5f765-f3dd-42f3-9829-2323ea982c58\") " pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.552262 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.553461 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.560694 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.560992 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.561163 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8pr4h" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.561352 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.561680 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.561824 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.562386 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.572577 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.611495 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647708 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647760 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647785 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647807 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647823 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647859 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647914 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647937 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbmbg\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-kube-api-access-jbmbg\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647958 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.647988 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749490 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749567 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749598 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749631 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 
11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749659 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749683 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749729 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749774 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749824 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749852 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-jbmbg\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-kube-api-access-jbmbg\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.749879 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.750314 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.750378 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.750321 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.751685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.751935 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.752394 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.754978 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.758609 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.772085 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.777222 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gfgnd"] Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.785253 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbmbg\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-kube-api-access-jbmbg\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.787785 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.851867 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.864310 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.865865 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.868761 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.869205 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" event={"ID":"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe","Type":"ContainerStarted","Data":"e064b3cd59c490b2516f1bb349cb707014a2cf22c9c1b5f5f685b2205fe96803"} Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.869337 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mrvfm" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.869595 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.869767 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.871236 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" event={"ID":"cdd3efe1-119d-408b-a76d-e98ee494dfde","Type":"ContainerStarted","Data":"b82d11e70f70028b774f5dbeb9ded6fd651da16a87c45ec84977728fb98166b1"} Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.872425 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.873521 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.874122 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.874368 4830 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-config-data" Mar 11 09:32:56 crc kubenswrapper[4830]: I0311 09:32:56.874630 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055498 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75e77e40-6cb5-47ec-9074-b663b7dba6b4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055571 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055599 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bw9\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-kube-api-access-48bw9\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055632 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/75e77e40-6cb5-47ec-9074-b663b7dba6b4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055654 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055680 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055721 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055735 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-config-data\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055766 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.055789 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157312 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157341 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-config-data\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157372 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157405 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157455 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75e77e40-6cb5-47ec-9074-b663b7dba6b4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157481 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157506 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157691 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.157971 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.158464 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-config-data\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.158543 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48bw9\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-kube-api-access-48bw9\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.158671 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75e77e40-6cb5-47ec-9074-b663b7dba6b4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.158713 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.158718 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " 
pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.159516 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.159745 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.164417 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.165427 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75e77e40-6cb5-47ec-9074-b663b7dba6b4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.166084 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75e77e40-6cb5-47ec-9074-b663b7dba6b4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.174874 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.176927 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48bw9\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-kube-api-access-48bw9\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.195110 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") " pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.243831 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.248381 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 09:32:57 crc kubenswrapper[4830]: W0311 09:32:57.253713 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ad5f765_f3dd_42f3_9829_2323ea982c58.slice/crio-d24b9ac109529386b1fb24926ccaf04b4784a4e7d32a8004aa4138aff62bc450 WatchSource:0}: Error finding container d24b9ac109529386b1fb24926ccaf04b4784a4e7d32a8004aa4138aff62bc450: Status 404 returned error can't find the container with id d24b9ac109529386b1fb24926ccaf04b4784a4e7d32a8004aa4138aff62bc450 Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.359701 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 09:32:57 crc kubenswrapper[4830]: W0311 09:32:57.366913 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf04bb60_c9f6_42a1_bd5b_1ae32d3de4c3.slice/crio-ec34ddf85b926f765458072737642be5ae9e7474b8b74e5cd15ba11843142d80 WatchSource:0}: Error finding container ec34ddf85b926f765458072737642be5ae9e7474b8b74e5cd15ba11843142d80: Status 404 returned error can't find the container with id ec34ddf85b926f765458072737642be5ae9e7474b8b74e5cd15ba11843142d80 Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.443606 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.445218 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.455470 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.458232 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.458755 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.459386 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qd4xj" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.459564 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.564759 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.564848 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.564882 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4zg\" (UniqueName: 
\"kubernetes.io/projected/9ce0c7bf-830f-40f3-850f-19b0a879ba23-kube-api-access-zk4zg\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.565066 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.565194 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ce0c7bf-830f-40f3-850f-19b0a879ba23-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.565233 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce0c7bf-830f-40f3-850f-19b0a879ba23-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.565283 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ce0c7bf-830f-40f3-850f-19b0a879ba23-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.565417 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.668824 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.668889 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.668944 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.668971 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4zg\" (UniqueName: \"kubernetes.io/projected/9ce0c7bf-830f-40f3-850f-19b0a879ba23-kube-api-access-zk4zg\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.669007 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.669054 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ce0c7bf-830f-40f3-850f-19b0a879ba23-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.669077 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce0c7bf-830f-40f3-850f-19b0a879ba23-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.669114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ce0c7bf-830f-40f3-850f-19b0a879ba23-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.669590 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9ce0c7bf-830f-40f3-850f-19b0a879ba23-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.669969 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.670686 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.671841 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ce0c7bf-830f-40f3-850f-19b0a879ba23-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.671965 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.676689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ce0c7bf-830f-40f3-850f-19b0a879ba23-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.689833 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce0c7bf-830f-40f3-850f-19b0a879ba23-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.722538 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4zg\" (UniqueName: \"kubernetes.io/projected/9ce0c7bf-830f-40f3-850f-19b0a879ba23-kube-api-access-zk4zg\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.820729 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9ce0c7bf-830f-40f3-850f-19b0a879ba23\") " pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.855349 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.855416 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.856600 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.864683 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.864927 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mksn9" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.865133 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.870268 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.888578 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3","Type":"ContainerStarted","Data":"ec34ddf85b926f765458072737642be5ae9e7474b8b74e5cd15ba11843142d80"} Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.909401 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7ad5f765-f3dd-42f3-9829-2323ea982c58","Type":"ContainerStarted","Data":"d24b9ac109529386b1fb24926ccaf04b4784a4e7d32a8004aa4138aff62bc450"} Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.973369 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.973422 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6jq\" (UniqueName: \"kubernetes.io/projected/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-kube-api-access-7x6jq\") pod 
\"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.973446 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-config-data\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.973501 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:57 crc kubenswrapper[4830]: I0311 09:32:57.973545 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-kolla-config\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.074400 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-kolla-config\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.074524 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.074557 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6jq\" (UniqueName: \"kubernetes.io/projected/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-kube-api-access-7x6jq\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.074583 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-config-data\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.074625 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.076656 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-kolla-config\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.076795 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-config-data\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.077195 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.079993 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.081258 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.101668 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x6jq\" (UniqueName: \"kubernetes.io/projected/60740879-ec5c-4d1f-bfd0-68ec5e8960f2-kube-api-access-7x6jq\") pod \"memcached-0\" (UID: \"60740879-ec5c-4d1f-bfd0-68ec5e8960f2\") " pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.222690 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.884273 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 11 09:32:58 crc kubenswrapper[4830]: I0311 09:32:58.981910 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75e77e40-6cb5-47ec-9074-b663b7dba6b4","Type":"ContainerStarted","Data":"5e5abd159913c6427c0f7c31e7151bf5e7593ea7c11a507f166f7d4ddbc2320a"} Mar 11 09:32:59 crc kubenswrapper[4830]: I0311 09:32:59.099612 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 11 09:32:59 crc kubenswrapper[4830]: W0311 09:32:59.120889 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60740879_ec5c_4d1f_bfd0_68ec5e8960f2.slice/crio-c6aba9258776c48f0e1ab0fbd248121a6e336a3d34fadb543b8deb3bccf54514 WatchSource:0}: Error finding container c6aba9258776c48f0e1ab0fbd248121a6e336a3d34fadb543b8deb3bccf54514: Status 404 returned error can't find the container with id c6aba9258776c48f0e1ab0fbd248121a6e336a3d34fadb543b8deb3bccf54514 Mar 11 09:32:59 crc kubenswrapper[4830]: I0311 09:32:59.995640 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"60740879-ec5c-4d1f-bfd0-68ec5e8960f2","Type":"ContainerStarted","Data":"c6aba9258776c48f0e1ab0fbd248121a6e336a3d34fadb543b8deb3bccf54514"} Mar 11 09:32:59 crc kubenswrapper[4830]: I0311 09:32:59.997383 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9ce0c7bf-830f-40f3-850f-19b0a879ba23","Type":"ContainerStarted","Data":"370def92deb5e050c0d8182416cc12c508f7c1fe01ee994aae52ea1257c7812b"} Mar 11 09:33:00 crc kubenswrapper[4830]: I0311 09:33:00.507087 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 09:33:00 crc 
kubenswrapper[4830]: I0311 09:33:00.508114 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 09:33:00 crc kubenswrapper[4830]: I0311 09:33:00.514282 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-x6jsw" Mar 11 09:33:00 crc kubenswrapper[4830]: I0311 09:33:00.514578 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 09:33:00 crc kubenswrapper[4830]: I0311 09:33:00.619269 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qf6b\" (UniqueName: \"kubernetes.io/projected/b5731a37-0030-4920-b5c2-ded8262d8e2a-kube-api-access-8qf6b\") pod \"kube-state-metrics-0\" (UID: \"b5731a37-0030-4920-b5c2-ded8262d8e2a\") " pod="openstack/kube-state-metrics-0" Mar 11 09:33:00 crc kubenswrapper[4830]: I0311 09:33:00.721215 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qf6b\" (UniqueName: \"kubernetes.io/projected/b5731a37-0030-4920-b5c2-ded8262d8e2a-kube-api-access-8qf6b\") pod \"kube-state-metrics-0\" (UID: \"b5731a37-0030-4920-b5c2-ded8262d8e2a\") " pod="openstack/kube-state-metrics-0" Mar 11 09:33:00 crc kubenswrapper[4830]: I0311 09:33:00.746574 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qf6b\" (UniqueName: \"kubernetes.io/projected/b5731a37-0030-4920-b5c2-ded8262d8e2a-kube-api-access-8qf6b\") pod \"kube-state-metrics-0\" (UID: \"b5731a37-0030-4920-b5c2-ded8262d8e2a\") " pod="openstack/kube-state-metrics-0" Mar 11 09:33:00 crc kubenswrapper[4830]: I0311 09:33:00.835766 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.232534 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xjsks"] Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.233875 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.235744 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-dkk7r" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.238817 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.244141 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.262591 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-klr5s"] Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.268914 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjsks"] Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.269057 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.277712 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-klr5s"] Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.281787 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-lib\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.281857 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-run-ovn\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.281956 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-etc-ovs\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282038 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvxwf\" (UniqueName: \"kubernetes.io/projected/97278c10-fe96-4de6-86cf-09ff64444a59-kube-api-access-tvxwf\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282069 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-run\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282129 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcc9q\" (UniqueName: \"kubernetes.io/projected/d97948cc-fc42-46c8-b46e-3f8efdc251db-kube-api-access-kcc9q\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282186 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d97948cc-fc42-46c8-b46e-3f8efdc251db-ovn-controller-tls-certs\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282216 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97948cc-fc42-46c8-b46e-3f8efdc251db-combined-ca-bundle\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282291 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-run\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282309 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-log\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282323 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97278c10-fe96-4de6-86cf-09ff64444a59-scripts\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282359 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-log-ovn\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.282374 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d97948cc-fc42-46c8-b46e-3f8efdc251db-scripts\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383316 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-lib\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383359 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-run-ovn\") pod \"ovn-controller-xjsks\" 
(UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383396 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-etc-ovs\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383429 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvxwf\" (UniqueName: \"kubernetes.io/projected/97278c10-fe96-4de6-86cf-09ff64444a59-kube-api-access-tvxwf\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383449 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-run\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383474 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcc9q\" (UniqueName: \"kubernetes.io/projected/d97948cc-fc42-46c8-b46e-3f8efdc251db-kube-api-access-kcc9q\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383499 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d97948cc-fc42-46c8-b46e-3f8efdc251db-ovn-controller-tls-certs\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 
09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383527 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97948cc-fc42-46c8-b46e-3f8efdc251db-combined-ca-bundle\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383559 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-run\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383578 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-log\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383597 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97278c10-fe96-4de6-86cf-09ff64444a59-scripts\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383618 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-log-ovn\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383639 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d97948cc-fc42-46c8-b46e-3f8efdc251db-scripts\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.383905 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-lib\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.385109 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-run-ovn\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.385223 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-etc-ovs\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.385328 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-log\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.388393 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d97948cc-fc42-46c8-b46e-3f8efdc251db-scripts\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc 
kubenswrapper[4830]: I0311 09:33:03.392763 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d97948cc-fc42-46c8-b46e-3f8efdc251db-ovn-controller-tls-certs\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.392931 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/97278c10-fe96-4de6-86cf-09ff64444a59-var-run\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.393361 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-log-ovn\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.393416 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d97948cc-fc42-46c8-b46e-3f8efdc251db-var-run\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.397512 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97278c10-fe96-4de6-86cf-09ff64444a59-scripts\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.403682 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvxwf\" (UniqueName: 
\"kubernetes.io/projected/97278c10-fe96-4de6-86cf-09ff64444a59-kube-api-access-tvxwf\") pod \"ovn-controller-ovs-klr5s\" (UID: \"97278c10-fe96-4de6-86cf-09ff64444a59\") " pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.408068 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97948cc-fc42-46c8-b46e-3f8efdc251db-combined-ca-bundle\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.410784 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcc9q\" (UniqueName: \"kubernetes.io/projected/d97948cc-fc42-46c8-b46e-3f8efdc251db-kube-api-access-kcc9q\") pod \"ovn-controller-xjsks\" (UID: \"d97948cc-fc42-46c8-b46e-3f8efdc251db\") " pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.566144 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjsks" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.604707 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.782964 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.784517 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.790102 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.793288 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.793666 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-gfqbx" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.793898 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.794079 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.800876 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.892208 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.892248 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.892276 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.892300 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1424a67-6a71-4943-b855-4795d2427214-config\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.892326 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1424a67-6a71-4943-b855-4795d2427214-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.892363 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77nxf\" (UniqueName: \"kubernetes.io/projected/a1424a67-6a71-4943-b855-4795d2427214-kube-api-access-77nxf\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.892395 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1424a67-6a71-4943-b855-4795d2427214-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.892526 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.993668 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.993750 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:03 crc kubenswrapper[4830]: I0311 09:33:03.993776 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.994648 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1424a67-6a71-4943-b855-4795d2427214-config\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.994689 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1424a67-6a71-4943-b855-4795d2427214-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 
09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.994689 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.995062 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77nxf\" (UniqueName: \"kubernetes.io/projected/a1424a67-6a71-4943-b855-4795d2427214-kube-api-access-77nxf\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.995113 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1424a67-6a71-4943-b855-4795d2427214-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.995284 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.995669 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1424a67-6a71-4943-b855-4795d2427214-config\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.996191 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a1424a67-6a71-4943-b855-4795d2427214-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:03.996194 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1424a67-6a71-4943-b855-4795d2427214-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:04.017040 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:04.021486 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:04.021713 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:04.023310 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1424a67-6a71-4943-b855-4795d2427214-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc 
kubenswrapper[4830]: I0311 09:33:04.037462 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77nxf\" (UniqueName: \"kubernetes.io/projected/a1424a67-6a71-4943-b855-4795d2427214-kube-api-access-77nxf\") pod \"ovsdbserver-nb-0\" (UID: \"a1424a67-6a71-4943-b855-4795d2427214\") " pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:04 crc kubenswrapper[4830]: I0311 09:33:04.121516 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.528283 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.530146 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.532203 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.532824 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.532925 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.532988 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-bqtnc" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.546229 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.671307 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.671379 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkj6c\" (UniqueName: \"kubernetes.io/projected/8f9d7ab5-467c-4888-8759-6e2ef59957e5-kube-api-access-gkj6c\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.671423 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.671454 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9d7ab5-467c-4888-8759-6e2ef59957e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.671472 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f9d7ab5-467c-4888-8759-6e2ef59957e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.671497 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f9d7ab5-467c-4888-8759-6e2ef59957e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 
11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.671606 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.671797 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.773506 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.773596 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkj6c\" (UniqueName: \"kubernetes.io/projected/8f9d7ab5-467c-4888-8759-6e2ef59957e5-kube-api-access-gkj6c\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.773655 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.773696 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9d7ab5-467c-4888-8759-6e2ef59957e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.773718 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f9d7ab5-467c-4888-8759-6e2ef59957e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.773750 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f9d7ab5-467c-4888-8759-6e2ef59957e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.773780 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.773822 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.774491 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.775306 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8f9d7ab5-467c-4888-8759-6e2ef59957e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.775880 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f9d7ab5-467c-4888-8759-6e2ef59957e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.775929 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f9d7ab5-467c-4888-8759-6e2ef59957e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.780070 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.781450 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.790587 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f9d7ab5-467c-4888-8759-6e2ef59957e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.795031 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.797677 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkj6c\" (UniqueName: \"kubernetes.io/projected/8f9d7ab5-467c-4888-8759-6e2ef59957e5-kube-api-access-gkj6c\") pod \"ovsdbserver-sb-0\" (UID: \"8f9d7ab5-467c-4888-8759-6e2ef59957e5\") " pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:07 crc kubenswrapper[4830]: I0311 09:33:07.862336 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:13 crc kubenswrapper[4830]: I0311 09:33:13.060567 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:33:13 crc kubenswrapper[4830]: I0311 09:33:13.061067 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:33:24 crc kubenswrapper[4830]: E0311 09:33:24.255556 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Mar 11 09:33:24 crc kubenswrapper[4830]: E0311 09:33:24.256249 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n8bh54h55bh58dhf9h575hfh657h5f9h689h55dh7dh668h85h558h698h5b7h5fch7ch697h678h584h55fhcbhb9h667hfch5h596h575hbdh59fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7x6jq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(60740879-ec5c-4d1f-bfd0-68ec5e8960f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:33:24 crc kubenswrapper[4830]: E0311 09:33:24.257753 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="60740879-ec5c-4d1f-bfd0-68ec5e8960f2" Mar 11 09:33:24 crc kubenswrapper[4830]: E0311 09:33:24.267544 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 11 09:33:24 crc kubenswrapper[4830]: E0311 09:33:24.267932 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zk4zg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerRes
izePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(9ce0c7bf-830f-40f3-850f-19b0a879ba23): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:33:24 crc kubenswrapper[4830]: E0311 09:33:24.269150 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="9ce0c7bf-830f-40f3-850f-19b0a879ba23" Mar 11 09:33:24 crc kubenswrapper[4830]: I0311 09:33:24.816345 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.218356 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.218926 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gn9qn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-4sq7m_openstack(c6b7e449-011e-4574-b449-fd58ec84aadd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.221142 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" podUID="c6b7e449-011e-4574-b449-fd58ec84aadd" Mar 11 09:33:25 crc kubenswrapper[4830]: I0311 09:33:25.228409 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8f9d7ab5-467c-4888-8759-6e2ef59957e5","Type":"ContainerStarted","Data":"a58bd4800e06786e0cbfe8fd21d4ac4e7d87ff5c0520aaeb65c44979515bdaf9"} Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.230639 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="60740879-ec5c-4d1f-bfd0-68ec5e8960f2" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.233197 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.234329 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ttql2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-fld8l_openstack(69f894d1-e1f9-4274-9422-0a4dbbb2e8fe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.244562 4830 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" podUID="69f894d1-e1f9-4274-9422-0a4dbbb2e8fe" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.292954 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.293115 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tthbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-gfgnd_openstack(cdd3efe1-119d-408b-a76d-e98ee494dfde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.296234 4830 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.300387 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.300548 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4t8d5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-9fncl_openstack(f58f0c06-356c-4d77-a927-c97077fe8122): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:33:25 crc kubenswrapper[4830]: E0311 09:33:25.303109 4830 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" podUID="f58f0c06-356c-4d77-a927-c97077fe8122" Mar 11 09:33:25 crc kubenswrapper[4830]: I0311 09:33:25.760711 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 09:33:25 crc kubenswrapper[4830]: I0311 09:33:25.776562 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjsks"] Mar 11 09:33:25 crc kubenswrapper[4830]: W0311 09:33:25.779189 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd97948cc_fc42_46c8_b46e_3f8efdc251db.slice/crio-084d3d82dbe76543ddc5a685e0313e3b8f7f799b9fddf0995b33fab9f73fa365 WatchSource:0}: Error finding container 084d3d82dbe76543ddc5a685e0313e3b8f7f799b9fddf0995b33fab9f73fa365: Status 404 returned error can't find the container with id 084d3d82dbe76543ddc5a685e0313e3b8f7f799b9fddf0995b33fab9f73fa365 Mar 11 09:33:25 crc kubenswrapper[4830]: I0311 09:33:25.821611 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-klr5s"] Mar 11 09:33:25 crc kubenswrapper[4830]: W0311 09:33:25.822815 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97278c10_fe96_4de6_86cf_09ff64444a59.slice/crio-ab0bb308c9e03e6f936cc69e5d3f2146544aa970c96ea2152e3e4bfd7b0511bd WatchSource:0}: Error finding container ab0bb308c9e03e6f936cc69e5d3f2146544aa970c96ea2152e3e4bfd7b0511bd: Status 404 returned error can't find the container with id ab0bb308c9e03e6f936cc69e5d3f2146544aa970c96ea2152e3e4bfd7b0511bd Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.240633 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"b5731a37-0030-4920-b5c2-ded8262d8e2a","Type":"ContainerStarted","Data":"3518723c18be813832176b3346f3ce971ac34c6747eb2eda4df787b0e2df25c1"} Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.242667 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9ce0c7bf-830f-40f3-850f-19b0a879ba23","Type":"ContainerStarted","Data":"3c6ef193a5e924a05d8348f2771e413f64295cca5c6e9e4d942fae25ef1a344f"} Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.245800 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7ad5f765-f3dd-42f3-9829-2323ea982c58","Type":"ContainerStarted","Data":"4b86f227f318bb7aa52dc38186b567f467b14fabfa0df58771c61c89411d9af6"} Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.249242 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-klr5s" event={"ID":"97278c10-fe96-4de6-86cf-09ff64444a59","Type":"ContainerStarted","Data":"ab0bb308c9e03e6f936cc69e5d3f2146544aa970c96ea2152e3e4bfd7b0511bd"} Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.251570 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjsks" event={"ID":"d97948cc-fc42-46c8-b46e-3f8efdc251db","Type":"ContainerStarted","Data":"084d3d82dbe76543ddc5a685e0313e3b8f7f799b9fddf0995b33fab9f73fa365"} Mar 11 09:33:26 crc kubenswrapper[4830]: E0311 09:33:26.253415 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" Mar 11 09:33:26 crc kubenswrapper[4830]: E0311 09:33:26.253695 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" podUID="69f894d1-e1f9-4274-9422-0a4dbbb2e8fe" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.711096 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.717327 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.795717 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t8d5\" (UniqueName: \"kubernetes.io/projected/f58f0c06-356c-4d77-a927-c97077fe8122-kube-api-access-4t8d5\") pod \"f58f0c06-356c-4d77-a927-c97077fe8122\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.795911 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn9qn\" (UniqueName: \"kubernetes.io/projected/c6b7e449-011e-4574-b449-fd58ec84aadd-kube-api-access-gn9qn\") pod \"c6b7e449-011e-4574-b449-fd58ec84aadd\" (UID: \"c6b7e449-011e-4574-b449-fd58ec84aadd\") " Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.795961 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-config\") pod \"f58f0c06-356c-4d77-a927-c97077fe8122\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.796136 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6b7e449-011e-4574-b449-fd58ec84aadd-config\") pod \"c6b7e449-011e-4574-b449-fd58ec84aadd\" (UID: \"c6b7e449-011e-4574-b449-fd58ec84aadd\") " Mar 11 09:33:26 crc 
kubenswrapper[4830]: I0311 09:33:26.796174 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-dns-svc\") pod \"f58f0c06-356c-4d77-a927-c97077fe8122\" (UID: \"f58f0c06-356c-4d77-a927-c97077fe8122\") " Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.796719 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-config" (OuterVolumeSpecName: "config") pod "f58f0c06-356c-4d77-a927-c97077fe8122" (UID: "f58f0c06-356c-4d77-a927-c97077fe8122"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.796743 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6b7e449-011e-4574-b449-fd58ec84aadd-config" (OuterVolumeSpecName: "config") pod "c6b7e449-011e-4574-b449-fd58ec84aadd" (UID: "c6b7e449-011e-4574-b449-fd58ec84aadd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.796741 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f58f0c06-356c-4d77-a927-c97077fe8122" (UID: "f58f0c06-356c-4d77-a927-c97077fe8122"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.801214 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b7e449-011e-4574-b449-fd58ec84aadd-kube-api-access-gn9qn" (OuterVolumeSpecName: "kube-api-access-gn9qn") pod "c6b7e449-011e-4574-b449-fd58ec84aadd" (UID: "c6b7e449-011e-4574-b449-fd58ec84aadd"). InnerVolumeSpecName "kube-api-access-gn9qn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.805717 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58f0c06-356c-4d77-a927-c97077fe8122-kube-api-access-4t8d5" (OuterVolumeSpecName: "kube-api-access-4t8d5") pod "f58f0c06-356c-4d77-a927-c97077fe8122" (UID: "f58f0c06-356c-4d77-a927-c97077fe8122"). InnerVolumeSpecName "kube-api-access-4t8d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.898059 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6b7e449-011e-4574-b449-fd58ec84aadd-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.898095 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.898105 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t8d5\" (UniqueName: \"kubernetes.io/projected/f58f0c06-356c-4d77-a927-c97077fe8122-kube-api-access-4t8d5\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.898118 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn9qn\" (UniqueName: \"kubernetes.io/projected/c6b7e449-011e-4574-b449-fd58ec84aadd-kube-api-access-gn9qn\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.898128 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f58f0c06-356c-4d77-a927-c97077fe8122-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:26 crc kubenswrapper[4830]: I0311 09:33:26.947560 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 11 
09:33:26 crc kubenswrapper[4830]: W0311 09:33:26.959897 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1424a67_6a71_4943_b855_4795d2427214.slice/crio-97ecf692052fe56f0389f0b13fd21d0ce4afaea4cfe8b9479fef1f78bc66a64d WatchSource:0}: Error finding container 97ecf692052fe56f0389f0b13fd21d0ce4afaea4cfe8b9479fef1f78bc66a64d: Status 404 returned error can't find the container with id 97ecf692052fe56f0389f0b13fd21d0ce4afaea4cfe8b9479fef1f78bc66a64d Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.259641 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.259641 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-4sq7m" event={"ID":"c6b7e449-011e-4574-b449-fd58ec84aadd","Type":"ContainerDied","Data":"77d064bca2b20d397b4766b4d9a1c8a97b4c0432334293ea4884b7617bf757d8"} Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.261355 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.261378 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9fncl" event={"ID":"f58f0c06-356c-4d77-a927-c97077fe8122","Type":"ContainerDied","Data":"95a07e59da5ea2d6b33f66a7c3d9d39015d32ec40425ca1d7bfe596451689b7c"} Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.263719 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a1424a67-6a71-4943-b855-4795d2427214","Type":"ContainerStarted","Data":"97ecf692052fe56f0389f0b13fd21d0ce4afaea4cfe8b9479fef1f78bc66a64d"} Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.266211 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3","Type":"ContainerStarted","Data":"d0f891f9ee5111ec4ec41fb1f690e427848198d851ed49b68aea9751db762add"} Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.267646 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75e77e40-6cb5-47ec-9074-b663b7dba6b4","Type":"ContainerStarted","Data":"14d9a01991262c020d733b7284b2b073581e01cc8c4c54b5086cc048dede37f5"} Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.318401 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9fncl"] Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.338981 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9fncl"] Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.358325 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4sq7m"] Mar 11 09:33:27 crc kubenswrapper[4830]: I0311 09:33:27.368602 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-4sq7m"] Mar 11 09:33:28 crc kubenswrapper[4830]: I0311 
09:33:28.946238 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b7e449-011e-4574-b449-fd58ec84aadd" path="/var/lib/kubelet/pods/c6b7e449-011e-4574-b449-fd58ec84aadd/volumes" Mar 11 09:33:28 crc kubenswrapper[4830]: I0311 09:33:28.947089 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58f0c06-356c-4d77-a927-c97077fe8122" path="/var/lib/kubelet/pods/f58f0c06-356c-4d77-a927-c97077fe8122/volumes" Mar 11 09:33:30 crc kubenswrapper[4830]: I0311 09:33:30.287685 4830 generic.go:334] "Generic (PLEG): container finished" podID="9ce0c7bf-830f-40f3-850f-19b0a879ba23" containerID="3c6ef193a5e924a05d8348f2771e413f64295cca5c6e9e4d942fae25ef1a344f" exitCode=0 Mar 11 09:33:30 crc kubenswrapper[4830]: I0311 09:33:30.287725 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9ce0c7bf-830f-40f3-850f-19b0a879ba23","Type":"ContainerDied","Data":"3c6ef193a5e924a05d8348f2771e413f64295cca5c6e9e4d942fae25ef1a344f"} Mar 11 09:33:30 crc kubenswrapper[4830]: I0311 09:33:30.290699 4830 generic.go:334] "Generic (PLEG): container finished" podID="7ad5f765-f3dd-42f3-9829-2323ea982c58" containerID="4b86f227f318bb7aa52dc38186b567f467b14fabfa0df58771c61c89411d9af6" exitCode=0 Mar 11 09:33:30 crc kubenswrapper[4830]: I0311 09:33:30.290749 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7ad5f765-f3dd-42f3-9829-2323ea982c58","Type":"ContainerDied","Data":"4b86f227f318bb7aa52dc38186b567f467b14fabfa0df58771c61c89411d9af6"} Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.347474 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7ad5f765-f3dd-42f3-9829-2323ea982c58","Type":"ContainerStarted","Data":"6a4760ee9d841b9539572637da63997e1f2b192abc4ddec4f666507260be6114"} Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.359415 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"8f9d7ab5-467c-4888-8759-6e2ef59957e5","Type":"ContainerStarted","Data":"f6b9bcc3f0b13a78774a4e8206aa4b3f439e2eab2b29a51ecc5b3b1cffff98de"} Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.362931 4830 generic.go:334] "Generic (PLEG): container finished" podID="97278c10-fe96-4de6-86cf-09ff64444a59" containerID="abd43a1aa01da268deead7889964dc32296ff47f69f048241e634a3c5bac49dc" exitCode=0 Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.363009 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-klr5s" event={"ID":"97278c10-fe96-4de6-86cf-09ff64444a59","Type":"ContainerDied","Data":"abd43a1aa01da268deead7889964dc32296ff47f69f048241e634a3c5bac49dc"} Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.364920 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjsks" event={"ID":"d97948cc-fc42-46c8-b46e-3f8efdc251db","Type":"ContainerStarted","Data":"78962545ebd6ab8ac4a68490d8eb708073371f21563d85b7645c4570a54b0f02"} Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.365044 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xjsks" Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.372797 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a1424a67-6a71-4943-b855-4795d2427214","Type":"ContainerStarted","Data":"de280708cb020de80d4b39cfd04d0007cee1fcb7b15f6a4b9a04ddf37e7a0aec"} Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.374971 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b5731a37-0030-4920-b5c2-ded8262d8e2a","Type":"ContainerStarted","Data":"5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08"} Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.375230 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.378143 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9ce0c7bf-830f-40f3-850f-19b0a879ba23","Type":"ContainerStarted","Data":"a7a124c3ecc9e667641d3929768d47d05e5f772b956c6b763ebb3acf0e7ee1a4"} Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.391889 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=12.480340783 podStartE2EDuration="40.390215297s" podCreationTimestamp="2026-03-11 09:32:55 +0000 UTC" firstStartedPulling="2026-03-11 09:32:57.279525232 +0000 UTC m=+1145.060675921" lastFinishedPulling="2026-03-11 09:33:25.189399746 +0000 UTC m=+1172.970550435" observedRunningTime="2026-03-11 09:33:35.372780195 +0000 UTC m=+1183.153930884" watchObservedRunningTime="2026-03-11 09:33:35.390215297 +0000 UTC m=+1183.171365986" Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.400330 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xjsks" podStartSLOduration=24.457173493 podStartE2EDuration="32.400312578s" podCreationTimestamp="2026-03-11 09:33:03 +0000 UTC" firstStartedPulling="2026-03-11 09:33:25.781313507 +0000 UTC m=+1173.562464196" lastFinishedPulling="2026-03-11 09:33:33.724452592 +0000 UTC m=+1181.505603281" observedRunningTime="2026-03-11 09:33:35.392488591 +0000 UTC m=+1183.173639280" watchObservedRunningTime="2026-03-11 09:33:35.400312578 +0000 UTC m=+1183.181463257" Mar 11 09:33:35 crc kubenswrapper[4830]: I0311 09:33:35.429681 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=26.907727168 podStartE2EDuration="35.429659991s" podCreationTimestamp="2026-03-11 09:33:00 +0000 UTC" firstStartedPulling="2026-03-11 09:33:25.762843384 +0000 UTC m=+1173.543994073" lastFinishedPulling="2026-03-11 
09:33:34.284776197 +0000 UTC m=+1182.065926896" observedRunningTime="2026-03-11 09:33:35.427365107 +0000 UTC m=+1183.208515796" watchObservedRunningTime="2026-03-11 09:33:35.429659991 +0000 UTC m=+1183.210810680" Mar 11 09:33:36 crc kubenswrapper[4830]: I0311 09:33:36.387718 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-klr5s" event={"ID":"97278c10-fe96-4de6-86cf-09ff64444a59","Type":"ContainerStarted","Data":"6c46b15175f0d1db361a3b80969b9f9a093949eab60249de65df6d277123df14"} Mar 11 09:33:36 crc kubenswrapper[4830]: I0311 09:33:36.388210 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-klr5s" event={"ID":"97278c10-fe96-4de6-86cf-09ff64444a59","Type":"ContainerStarted","Data":"59af104eba96ee9bffc814d802db180825cb74497c6fd3771968cc41019d1659"} Mar 11 09:33:36 crc kubenswrapper[4830]: I0311 09:33:36.388234 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:36 crc kubenswrapper[4830]: I0311 09:33:36.388292 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:33:36 crc kubenswrapper[4830]: I0311 09:33:36.410430 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-klr5s" podStartSLOduration=25.722339999 podStartE2EDuration="33.410407596s" podCreationTimestamp="2026-03-11 09:33:03 +0000 UTC" firstStartedPulling="2026-03-11 09:33:25.825683546 +0000 UTC m=+1173.606834235" lastFinishedPulling="2026-03-11 09:33:33.513751133 +0000 UTC m=+1181.294901832" observedRunningTime="2026-03-11 09:33:36.405814149 +0000 UTC m=+1184.186964888" watchObservedRunningTime="2026-03-11 09:33:36.410407596 +0000 UTC m=+1184.191558285" Mar 11 09:33:36 crc kubenswrapper[4830]: I0311 09:33:36.412617 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" 
podStartSLOduration=-9223371996.442165 podStartE2EDuration="40.412610917s" podCreationTimestamp="2026-03-11 09:32:56 +0000 UTC" firstStartedPulling="2026-03-11 09:32:58.951496246 +0000 UTC m=+1146.732646935" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:35.454324884 +0000 UTC m=+1183.235475593" watchObservedRunningTime="2026-03-11 09:33:36.412610917 +0000 UTC m=+1184.193761606" Mar 11 09:33:36 crc kubenswrapper[4830]: I0311 09:33:36.612556 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 11 09:33:36 crc kubenswrapper[4830]: I0311 09:33:36.612607 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 11 09:33:38 crc kubenswrapper[4830]: I0311 09:33:38.077958 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 11 09:33:38 crc kubenswrapper[4830]: I0311 09:33:38.078428 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 11 09:33:38 crc kubenswrapper[4830]: I0311 09:33:38.409712 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a1424a67-6a71-4943-b855-4795d2427214","Type":"ContainerStarted","Data":"6d9ca897d9c41ad081cca8fa58ff0da9419f2a3c5f62627ddc535b98471154ab"} Mar 11 09:33:38 crc kubenswrapper[4830]: I0311 09:33:38.412592 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8f9d7ab5-467c-4888-8759-6e2ef59957e5","Type":"ContainerStarted","Data":"f1638e9b5308b797928c9ffb55501e72737926be745ce71c465ca01a48b62409"} Mar 11 09:33:38 crc kubenswrapper[4830]: I0311 09:33:38.438915 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=25.480349685 podStartE2EDuration="36.438895153s" podCreationTimestamp="2026-03-11 
09:33:02 +0000 UTC" firstStartedPulling="2026-03-11 09:33:26.962732412 +0000 UTC m=+1174.743883101" lastFinishedPulling="2026-03-11 09:33:37.92127788 +0000 UTC m=+1185.702428569" observedRunningTime="2026-03-11 09:33:38.431605491 +0000 UTC m=+1186.212756190" watchObservedRunningTime="2026-03-11 09:33:38.438895153 +0000 UTC m=+1186.220045842" Mar 11 09:33:38 crc kubenswrapper[4830]: I0311 09:33:38.459209 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.697985928 podStartE2EDuration="32.459189645s" podCreationTimestamp="2026-03-11 09:33:06 +0000 UTC" firstStartedPulling="2026-03-11 09:33:25.182416302 +0000 UTC m=+1172.963566991" lastFinishedPulling="2026-03-11 09:33:37.943620019 +0000 UTC m=+1185.724770708" observedRunningTime="2026-03-11 09:33:38.45176655 +0000 UTC m=+1186.232917249" watchObservedRunningTime="2026-03-11 09:33:38.459189645 +0000 UTC m=+1186.240340334" Mar 11 09:33:39 crc kubenswrapper[4830]: I0311 09:33:39.123101 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:39 crc kubenswrapper[4830]: I0311 09:33:39.421055 4830 generic.go:334] "Generic (PLEG): container finished" podID="cdd3efe1-119d-408b-a76d-e98ee494dfde" containerID="a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6" exitCode=0 Mar 11 09:33:39 crc kubenswrapper[4830]: I0311 09:33:39.421060 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" event={"ID":"cdd3efe1-119d-408b-a76d-e98ee494dfde","Type":"ContainerDied","Data":"a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6"} Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.122499 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.196811 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.414172 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.433322 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" event={"ID":"cdd3efe1-119d-408b-a76d-e98ee494dfde","Type":"ContainerStarted","Data":"3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c"} Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.437043 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.479830 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" podStartSLOduration=3.694583775 podStartE2EDuration="45.479810384s" podCreationTimestamp="2026-03-11 09:32:55 +0000 UTC" firstStartedPulling="2026-03-11 09:32:56.829847266 +0000 UTC m=+1144.610997965" lastFinishedPulling="2026-03-11 09:33:38.615073885 +0000 UTC m=+1186.396224574" observedRunningTime="2026-03-11 09:33:40.460401237 +0000 UTC m=+1188.241551926" watchObservedRunningTime="2026-03-11 09:33:40.479810384 +0000 UTC m=+1188.260961073" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.483521 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.522148 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.756327 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gfgnd"] Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.785561 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5bf47b49b7-j8pmz"] Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.786764 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.812327 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.815238 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-dp7ql"] Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.816240 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.817954 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.845852 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.854443 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dp7ql"] Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.862608 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.890659 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-config\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.890851 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.890959 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.891121 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnq45\" (UniqueName: \"kubernetes.io/projected/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-kube-api-access-fnq45\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.893192 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-j8pmz"] Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.948654 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993239 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1a868478-8050-4c0f-a7f4-d6dcc82f9832-ovs-rundir\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993359 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1a868478-8050-4c0f-a7f4-d6dcc82f9832-config\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993397 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a868478-8050-4c0f-a7f4-d6dcc82f9832-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993431 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-config\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993487 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993538 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1a868478-8050-4c0f-a7f4-d6dcc82f9832-ovn-rundir\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993570 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a868478-8050-4c0f-a7f4-d6dcc82f9832-combined-ca-bundle\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993609 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993695 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnq45\" (UniqueName: \"kubernetes.io/projected/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-kube-api-access-fnq45\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.993730 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhqz5\" (UniqueName: \"kubernetes.io/projected/1a868478-8050-4c0f-a7f4-d6dcc82f9832-kube-api-access-xhqz5\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.995743 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-config\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.996775 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:40 crc kubenswrapper[4830]: I0311 09:33:40.998528 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.019477 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnq45\" (UniqueName: \"kubernetes.io/projected/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-kube-api-access-fnq45\") pod \"dnsmasq-dns-5bf47b49b7-j8pmz\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") " pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.089244 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fld8l"] Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.095493 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhqz5\" (UniqueName: \"kubernetes.io/projected/1a868478-8050-4c0f-a7f4-d6dcc82f9832-kube-api-access-xhqz5\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.095564 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1a868478-8050-4c0f-a7f4-d6dcc82f9832-ovs-rundir\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.095614 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a868478-8050-4c0f-a7f4-d6dcc82f9832-config\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.095643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a868478-8050-4c0f-a7f4-d6dcc82f9832-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.095689 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1a868478-8050-4c0f-a7f4-d6dcc82f9832-ovn-rundir\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.095717 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a868478-8050-4c0f-a7f4-d6dcc82f9832-combined-ca-bundle\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.097120 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a868478-8050-4c0f-a7f4-d6dcc82f9832-config\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.097612 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" 
(UniqueName: \"kubernetes.io/host-path/1a868478-8050-4c0f-a7f4-d6dcc82f9832-ovs-rundir\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.108628 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1a868478-8050-4c0f-a7f4-d6dcc82f9832-ovn-rundir\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.108973 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.110702 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a868478-8050-4c0f-a7f4-d6dcc82f9832-combined-ca-bundle\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.113683 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a868478-8050-4c0f-a7f4-d6dcc82f9832-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.131168 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-mkvnn"] Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.132853 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.144337 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhqz5\" (UniqueName: \"kubernetes.io/projected/1a868478-8050-4c0f-a7f4-d6dcc82f9832-kube-api-access-xhqz5\") pod \"ovn-controller-metrics-dp7ql\" (UID: \"1a868478-8050-4c0f-a7f4-d6dcc82f9832\") " pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.145279 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.166098 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mkvnn"] Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.299135 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.299207 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdsvj\" (UniqueName: \"kubernetes.io/projected/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-kube-api-access-kdsvj\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.299254 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " 
pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.299271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-dns-svc\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.299320 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-config\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.364925 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.401855 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-config\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.401966 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.401996 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdsvj\" (UniqueName: 
\"kubernetes.io/projected/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-kube-api-access-kdsvj\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.402036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.402069 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-dns-svc\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.403043 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-dns-svc\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.403531 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.403738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-sb\") pod 
\"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.404666 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-config\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.426595 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdsvj\" (UniqueName: \"kubernetes.io/projected/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-kube-api-access-kdsvj\") pod \"dnsmasq-dns-8554648995-mkvnn\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.438330 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-dp7ql" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.457074 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.457348 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" event={"ID":"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe","Type":"ContainerDied","Data":"e064b3cd59c490b2516f1bb349cb707014a2cf22c9c1b5f5f685b2205fe96803"} Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.457432 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-fld8l" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.466249 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"60740879-ec5c-4d1f-bfd0-68ec5e8960f2","Type":"ContainerStarted","Data":"b41e07f56b6a9fc534f57f60951804c898bfde3da0b2d1aa9d029f4189a85323"} Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.467683 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.467742 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.490882 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.187574277 podStartE2EDuration="44.490865359s" podCreationTimestamp="2026-03-11 09:32:57 +0000 UTC" firstStartedPulling="2026-03-11 09:32:59.13027171 +0000 UTC m=+1146.911422389" lastFinishedPulling="2026-03-11 09:33:40.433562792 +0000 UTC m=+1188.214713471" observedRunningTime="2026-03-11 09:33:41.486519229 +0000 UTC m=+1189.267669918" watchObservedRunningTime="2026-03-11 09:33:41.490865359 +0000 UTC m=+1189.272016048" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.503501 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttql2\" (UniqueName: \"kubernetes.io/projected/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-kube-api-access-ttql2\") pod \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.503566 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-config\") pod \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\" (UID: 
\"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.503615 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-dns-svc\") pod \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\" (UID: \"69f894d1-e1f9-4274-9422-0a4dbbb2e8fe\") " Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.504053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-config" (OuterVolumeSpecName: "config") pod "69f894d1-e1f9-4274-9422-0a4dbbb2e8fe" (UID: "69f894d1-e1f9-4274-9422-0a4dbbb2e8fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.504414 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69f894d1-e1f9-4274-9422-0a4dbbb2e8fe" (UID: "69f894d1-e1f9-4274-9422-0a4dbbb2e8fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.507055 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-kube-api-access-ttql2" (OuterVolumeSpecName: "kube-api-access-ttql2") pod "69f894d1-e1f9-4274-9422-0a4dbbb2e8fe" (UID: "69f894d1-e1f9-4274-9422-0a4dbbb2e8fe"). InnerVolumeSpecName "kube-api-access-ttql2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.524078 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.610428 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttql2\" (UniqueName: \"kubernetes.io/projected/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-kube-api-access-ttql2\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.611533 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.611545 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.699167 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.701970 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.720556 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.720904 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.723835 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-r5lqv" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.738173 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.749806 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.761655 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-j8pmz"] Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.822348 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.822415 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-scripts\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.822442 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-config\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.822505 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbwbs\" (UniqueName: \"kubernetes.io/projected/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-kube-api-access-wbwbs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.822536 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.822582 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.822682 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.866426 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-fld8l"] Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.871613 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5ccc8479f9-fld8l"] Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.924146 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.924271 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.924314 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-scripts\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.924335 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-config\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.924379 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbwbs\" (UniqueName: \"kubernetes.io/projected/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-kube-api-access-wbwbs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.924427 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.924464 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.931182 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-scripts\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.931487 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.931822 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-config\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.933569 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.939643 
4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.941394 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.950796 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbwbs\" (UniqueName: \"kubernetes.io/projected/af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0-kube-api-access-wbwbs\") pod \"ovn-northd-0\" (UID: \"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0\") " pod="openstack/ovn-northd-0" Mar 11 09:33:41 crc kubenswrapper[4830]: I0311 09:33:41.986620 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-dp7ql"] Mar 11 09:33:42 crc kubenswrapper[4830]: W0311 09:33:42.057975 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a868478_8050_4c0f_a7f4_d6dcc82f9832.slice/crio-58918142fb8f454e1ffc65a0e03836801798aab4c4d9c071d78c7168caa2652f WatchSource:0}: Error finding container 58918142fb8f454e1ffc65a0e03836801798aab4c4d9c071d78c7168caa2652f: Status 404 returned error can't find the container with id 58918142fb8f454e1ffc65a0e03836801798aab4c4d9c071d78c7168caa2652f Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.066349 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mkvnn"] Mar 11 09:33:42 crc kubenswrapper[4830]: W0311 09:33:42.068122 4830 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac5c494a_d5f6_4baa_8ed1_c8c4711bb36a.slice/crio-618ce78324a8af33895588acaf10d15bbc911420aae3b44fe7ee4dda42ae0afe WatchSource:0}: Error finding container 618ce78324a8af33895588acaf10d15bbc911420aae3b44fe7ee4dda42ae0afe: Status 404 returned error can't find the container with id 618ce78324a8af33895588acaf10d15bbc911420aae3b44fe7ee4dda42ae0afe Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.116295 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.398878 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 11 09:33:42 crc kubenswrapper[4830]: W0311 09:33:42.403544 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf5dcb58_54a3_4ca2_a5b5_0a04bea6e6e0.slice/crio-875d7aa1ca6cc4b6f72394af49c03892a2a16f7282ec34e255d80347f9574266 WatchSource:0}: Error finding container 875d7aa1ca6cc4b6f72394af49c03892a2a16f7282ec34e255d80347f9574266: Status 404 returned error can't find the container with id 875d7aa1ca6cc4b6f72394af49c03892a2a16f7282ec34e255d80347f9574266 Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.477072 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dp7ql" event={"ID":"1a868478-8050-4c0f-a7f4-d6dcc82f9832","Type":"ContainerStarted","Data":"40307b21f8479e1da685149c237f0b9b96f2874d26d855edd434bcb40653ee12"} Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.477199 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-dp7ql" event={"ID":"1a868478-8050-4c0f-a7f4-d6dcc82f9832","Type":"ContainerStarted","Data":"58918142fb8f454e1ffc65a0e03836801798aab4c4d9c071d78c7168caa2652f"} Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.481990 4830 generic.go:334] 
"Generic (PLEG): container finished" podID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerID="2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f" exitCode=0 Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.482105 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" event={"ID":"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1","Type":"ContainerDied","Data":"2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f"} Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.482139 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" event={"ID":"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1","Type":"ContainerStarted","Data":"7792fc8b91c2f137df802e987196e89b26a1c9367c16d0d900f4593c380edd8d"} Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.484601 4830 generic.go:334] "Generic (PLEG): container finished" podID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerID="ca30c7c2289642cbc846c4646c59f7b1e517d9de4b346c451bb4e5c0b50e8841" exitCode=0 Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.484633 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mkvnn" event={"ID":"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a","Type":"ContainerDied","Data":"ca30c7c2289642cbc846c4646c59f7b1e517d9de4b346c451bb4e5c0b50e8841"} Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.484688 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mkvnn" event={"ID":"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a","Type":"ContainerStarted","Data":"618ce78324a8af33895588acaf10d15bbc911420aae3b44fe7ee4dda42ae0afe"} Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.485610 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0","Type":"ContainerStarted","Data":"875d7aa1ca6cc4b6f72394af49c03892a2a16f7282ec34e255d80347f9574266"} Mar 11 09:33:42 
crc kubenswrapper[4830]: I0311 09:33:42.486243 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" containerName="dnsmasq-dns" containerID="cri-o://3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c" gracePeriod=10 Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.530805 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-dp7ql" podStartSLOduration=2.530785634 podStartE2EDuration="2.530785634s" podCreationTimestamp="2026-03-11 09:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:42.504985919 +0000 UTC m=+1190.286136618" watchObservedRunningTime="2026-03-11 09:33:42.530785634 +0000 UTC m=+1190.311936323" Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.864965 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.946914 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f894d1-e1f9-4274-9422-0a4dbbb2e8fe" path="/var/lib/kubelet/pods/69f894d1-e1f9-4274-9422-0a4dbbb2e8fe/volumes" Mar 11 09:33:42 crc kubenswrapper[4830]: I0311 09:33:42.979759 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.060982 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.061076 4830 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.074973 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.184297 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-config\") pod \"cdd3efe1-119d-408b-a76d-e98ee494dfde\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.184394 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tthbs\" (UniqueName: \"kubernetes.io/projected/cdd3efe1-119d-408b-a76d-e98ee494dfde-kube-api-access-tthbs\") pod \"cdd3efe1-119d-408b-a76d-e98ee494dfde\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.184423 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-dns-svc\") pod \"cdd3efe1-119d-408b-a76d-e98ee494dfde\" (UID: \"cdd3efe1-119d-408b-a76d-e98ee494dfde\") " Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.190670 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd3efe1-119d-408b-a76d-e98ee494dfde-kube-api-access-tthbs" (OuterVolumeSpecName: "kube-api-access-tthbs") pod "cdd3efe1-119d-408b-a76d-e98ee494dfde" (UID: "cdd3efe1-119d-408b-a76d-e98ee494dfde"). InnerVolumeSpecName "kube-api-access-tthbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.229082 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-config" (OuterVolumeSpecName: "config") pod "cdd3efe1-119d-408b-a76d-e98ee494dfde" (UID: "cdd3efe1-119d-408b-a76d-e98ee494dfde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.239618 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cdd3efe1-119d-408b-a76d-e98ee494dfde" (UID: "cdd3efe1-119d-408b-a76d-e98ee494dfde"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.285871 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.285915 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tthbs\" (UniqueName: \"kubernetes.io/projected/cdd3efe1-119d-408b-a76d-e98ee494dfde-kube-api-access-tthbs\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.285931 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cdd3efe1-119d-408b-a76d-e98ee494dfde-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.498901 4830 generic.go:334] "Generic (PLEG): container finished" podID="cdd3efe1-119d-408b-a76d-e98ee494dfde" containerID="3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c" exitCode=0 Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 
09:33:43.498966 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" event={"ID":"cdd3efe1-119d-408b-a76d-e98ee494dfde","Type":"ContainerDied","Data":"3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c"} Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.498998 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" event={"ID":"cdd3efe1-119d-408b-a76d-e98ee494dfde","Type":"ContainerDied","Data":"b82d11e70f70028b774f5dbeb9ded6fd651da16a87c45ec84977728fb98166b1"} Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.499028 4830 scope.go:117] "RemoveContainer" containerID="3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.499066 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gfgnd" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.504246 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" event={"ID":"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1","Type":"ContainerStarted","Data":"c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846"} Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.504431 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.511912 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mkvnn" event={"ID":"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a","Type":"ContainerStarted","Data":"4d82246f8c771b8d1aa4957b6eedca363fe47c836b57dbd8c5d503d49ccb1070"} Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.511951 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.527157 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" podStartSLOduration=3.527138222 podStartE2EDuration="3.527138222s" podCreationTimestamp="2026-03-11 09:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:43.522669698 +0000 UTC m=+1191.303820427" watchObservedRunningTime="2026-03-11 09:33:43.527138222 +0000 UTC m=+1191.308288901" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.548826 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-mkvnn" podStartSLOduration=2.548803542 podStartE2EDuration="2.548803542s" podCreationTimestamp="2026-03-11 09:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:43.542329993 +0000 UTC m=+1191.323480702" watchObservedRunningTime="2026-03-11 09:33:43.548803542 +0000 UTC m=+1191.329954231" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.561447 4830 scope.go:117] "RemoveContainer" containerID="a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.564559 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gfgnd"] Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.569316 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gfgnd"] Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.588341 4830 scope.go:117] "RemoveContainer" containerID="3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c" Mar 11 09:33:43 crc kubenswrapper[4830]: E0311 09:33:43.588993 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c\": container with ID starting with 3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c not found: ID does not exist" containerID="3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.589067 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c"} err="failed to get container status \"3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c\": rpc error: code = NotFound desc = could not find container \"3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c\": container with ID starting with 3d4b1c63354f7c12836cf223c83e34e7959ae8ff8466b88b5ea049d91f69bf1c not found: ID does not exist" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.589098 4830 scope.go:117] "RemoveContainer" containerID="a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6" Mar 11 09:33:43 crc kubenswrapper[4830]: E0311 09:33:43.590007 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6\": container with ID starting with a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6 not found: ID does not exist" containerID="a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6" Mar 11 09:33:43 crc kubenswrapper[4830]: I0311 09:33:43.590112 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6"} err="failed to get container status \"a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6\": rpc error: code = NotFound desc = could not find container \"a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6\": container with ID 
starting with a745e309c01c29162bce8a0c39c2d25ddd705727d72e7f59a6c2c579c65d10f6 not found: ID does not exist" Mar 11 09:33:44 crc kubenswrapper[4830]: I0311 09:33:44.519402 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0","Type":"ContainerStarted","Data":"c59e552f633753fe7a81abf80db772ead39634b5d9941def8d1e7f0c377b1b60"} Mar 11 09:33:44 crc kubenswrapper[4830]: I0311 09:33:44.519759 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0","Type":"ContainerStarted","Data":"1f3fe3fcc58f878c99de05f5524930c5a3de1713b7c4f8a64c136f286cdcd1dd"} Mar 11 09:33:44 crc kubenswrapper[4830]: I0311 09:33:44.519776 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 11 09:33:44 crc kubenswrapper[4830]: I0311 09:33:44.544264 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.270062678 podStartE2EDuration="3.544247674s" podCreationTimestamp="2026-03-11 09:33:41 +0000 UTC" firstStartedPulling="2026-03-11 09:33:42.407269502 +0000 UTC m=+1190.188420191" lastFinishedPulling="2026-03-11 09:33:43.681454498 +0000 UTC m=+1191.462605187" observedRunningTime="2026-03-11 09:33:44.535707908 +0000 UTC m=+1192.316858597" watchObservedRunningTime="2026-03-11 09:33:44.544247674 +0000 UTC m=+1192.325398363" Mar 11 09:33:44 crc kubenswrapper[4830]: I0311 09:33:44.942261 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" path="/var/lib/kubelet/pods/cdd3efe1-119d-408b-a76d-e98ee494dfde/volumes" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.301512 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-w4w6w"] Mar 11 09:33:45 crc kubenswrapper[4830]: E0311 09:33:45.301954 4830 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" containerName="init" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.301977 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" containerName="init" Mar 11 09:33:45 crc kubenswrapper[4830]: E0311 09:33:45.302092 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" containerName="dnsmasq-dns" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.302103 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" containerName="dnsmasq-dns" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.302326 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd3efe1-119d-408b-a76d-e98ee494dfde" containerName="dnsmasq-dns" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.302997 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.304868 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.308159 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-w4w6w"] Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.424709 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdkln\" (UniqueName: \"kubernetes.io/projected/72de00a2-a84b-46ad-9830-d880367fea73-kube-api-access-jdkln\") pod \"root-account-create-update-w4w6w\" (UID: \"72de00a2-a84b-46ad-9830-d880367fea73\") " pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.424770 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72de00a2-a84b-46ad-9830-d880367fea73-operator-scripts\") pod \"root-account-create-update-w4w6w\" (UID: \"72de00a2-a84b-46ad-9830-d880367fea73\") " pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.528275 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdkln\" (UniqueName: \"kubernetes.io/projected/72de00a2-a84b-46ad-9830-d880367fea73-kube-api-access-jdkln\") pod \"root-account-create-update-w4w6w\" (UID: \"72de00a2-a84b-46ad-9830-d880367fea73\") " pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.528352 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72de00a2-a84b-46ad-9830-d880367fea73-operator-scripts\") pod \"root-account-create-update-w4w6w\" (UID: \"72de00a2-a84b-46ad-9830-d880367fea73\") " pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.529217 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72de00a2-a84b-46ad-9830-d880367fea73-operator-scripts\") pod \"root-account-create-update-w4w6w\" (UID: \"72de00a2-a84b-46ad-9830-d880367fea73\") " pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.559554 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdkln\" (UniqueName: \"kubernetes.io/projected/72de00a2-a84b-46ad-9830-d880367fea73-kube-api-access-jdkln\") pod \"root-account-create-update-w4w6w\" (UID: \"72de00a2-a84b-46ad-9830-d880367fea73\") " pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:45 crc kubenswrapper[4830]: I0311 09:33:45.619220 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:46 crc kubenswrapper[4830]: I0311 09:33:46.075992 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-w4w6w"] Mar 11 09:33:46 crc kubenswrapper[4830]: W0311 09:33:46.079545 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72de00a2_a84b_46ad_9830_d880367fea73.slice/crio-540a501d94f2be8fda2416c966f645ad30a5c4e179a2b28bfca1d7b99b667d71 WatchSource:0}: Error finding container 540a501d94f2be8fda2416c966f645ad30a5c4e179a2b28bfca1d7b99b667d71: Status 404 returned error can't find the container with id 540a501d94f2be8fda2416c966f645ad30a5c4e179a2b28bfca1d7b99b667d71 Mar 11 09:33:46 crc kubenswrapper[4830]: I0311 09:33:46.538383 4830 generic.go:334] "Generic (PLEG): container finished" podID="72de00a2-a84b-46ad-9830-d880367fea73" containerID="77716cbaf234aa7148b92e8585531c510c9e63187c20a1e92cb8512464b65857" exitCode=0 Mar 11 09:33:46 crc kubenswrapper[4830]: I0311 09:33:46.538424 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w4w6w" event={"ID":"72de00a2-a84b-46ad-9830-d880367fea73","Type":"ContainerDied","Data":"77716cbaf234aa7148b92e8585531c510c9e63187c20a1e92cb8512464b65857"} Mar 11 09:33:46 crc kubenswrapper[4830]: I0311 09:33:46.538446 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w4w6w" event={"ID":"72de00a2-a84b-46ad-9830-d880367fea73","Type":"ContainerStarted","Data":"540a501d94f2be8fda2416c966f645ad30a5c4e179a2b28bfca1d7b99b667d71"} Mar 11 09:33:47 crc kubenswrapper[4830]: I0311 09:33:47.846917 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.002814 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdkln\" (UniqueName: \"kubernetes.io/projected/72de00a2-a84b-46ad-9830-d880367fea73-kube-api-access-jdkln\") pod \"72de00a2-a84b-46ad-9830-d880367fea73\" (UID: \"72de00a2-a84b-46ad-9830-d880367fea73\") " Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.002944 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72de00a2-a84b-46ad-9830-d880367fea73-operator-scripts\") pod \"72de00a2-a84b-46ad-9830-d880367fea73\" (UID: \"72de00a2-a84b-46ad-9830-d880367fea73\") " Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.003897 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72de00a2-a84b-46ad-9830-d880367fea73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72de00a2-a84b-46ad-9830-d880367fea73" (UID: "72de00a2-a84b-46ad-9830-d880367fea73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.024284 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72de00a2-a84b-46ad-9830-d880367fea73-kube-api-access-jdkln" (OuterVolumeSpecName: "kube-api-access-jdkln") pod "72de00a2-a84b-46ad-9830-d880367fea73" (UID: "72de00a2-a84b-46ad-9830-d880367fea73"). InnerVolumeSpecName "kube-api-access-jdkln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.104981 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72de00a2-a84b-46ad-9830-d880367fea73-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.105037 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdkln\" (UniqueName: \"kubernetes.io/projected/72de00a2-a84b-46ad-9830-d880367fea73-kube-api-access-jdkln\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.224006 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.420931 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-p7nmp"] Mar 11 09:33:48 crc kubenswrapper[4830]: E0311 09:33:48.421239 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72de00a2-a84b-46ad-9830-d880367fea73" containerName="mariadb-account-create-update" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.421259 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="72de00a2-a84b-46ad-9830-d880367fea73" containerName="mariadb-account-create-update" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.421417 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="72de00a2-a84b-46ad-9830-d880367fea73" containerName="mariadb-account-create-update" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.421928 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.431162 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p7nmp"] Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.512055 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42ddef2-86d1-435e-b240-f159db41a2d2-operator-scripts\") pod \"glance-db-create-p7nmp\" (UID: \"a42ddef2-86d1-435e-b240-f159db41a2d2\") " pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.512098 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgcl\" (UniqueName: \"kubernetes.io/projected/a42ddef2-86d1-435e-b240-f159db41a2d2-kube-api-access-7wgcl\") pod \"glance-db-create-p7nmp\" (UID: \"a42ddef2-86d1-435e-b240-f159db41a2d2\") " pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.518922 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c667-account-create-update-cpknz"] Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.519999 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.522241 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.534729 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c667-account-create-update-cpknz"] Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.555788 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w4w6w" event={"ID":"72de00a2-a84b-46ad-9830-d880367fea73","Type":"ContainerDied","Data":"540a501d94f2be8fda2416c966f645ad30a5c4e179a2b28bfca1d7b99b667d71"} Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.555837 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="540a501d94f2be8fda2416c966f645ad30a5c4e179a2b28bfca1d7b99b667d71" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.555907 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w4w6w" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.613302 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ms9k\" (UniqueName: \"kubernetes.io/projected/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-kube-api-access-5ms9k\") pod \"glance-c667-account-create-update-cpknz\" (UID: \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\") " pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.613363 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42ddef2-86d1-435e-b240-f159db41a2d2-operator-scripts\") pod \"glance-db-create-p7nmp\" (UID: \"a42ddef2-86d1-435e-b240-f159db41a2d2\") " pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.613384 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgcl\" (UniqueName: \"kubernetes.io/projected/a42ddef2-86d1-435e-b240-f159db41a2d2-kube-api-access-7wgcl\") pod \"glance-db-create-p7nmp\" (UID: \"a42ddef2-86d1-435e-b240-f159db41a2d2\") " pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.613433 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-operator-scripts\") pod \"glance-c667-account-create-update-cpknz\" (UID: \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\") " pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.614189 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42ddef2-86d1-435e-b240-f159db41a2d2-operator-scripts\") pod 
\"glance-db-create-p7nmp\" (UID: \"a42ddef2-86d1-435e-b240-f159db41a2d2\") " pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.632338 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgcl\" (UniqueName: \"kubernetes.io/projected/a42ddef2-86d1-435e-b240-f159db41a2d2-kube-api-access-7wgcl\") pod \"glance-db-create-p7nmp\" (UID: \"a42ddef2-86d1-435e-b240-f159db41a2d2\") " pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.714445 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-operator-scripts\") pod \"glance-c667-account-create-update-cpknz\" (UID: \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\") " pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.714542 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ms9k\" (UniqueName: \"kubernetes.io/projected/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-kube-api-access-5ms9k\") pod \"glance-c667-account-create-update-cpknz\" (UID: \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\") " pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.715171 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-operator-scripts\") pod \"glance-c667-account-create-update-cpknz\" (UID: \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\") " pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.730704 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ms9k\" (UniqueName: 
\"kubernetes.io/projected/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-kube-api-access-5ms9k\") pod \"glance-c667-account-create-update-cpknz\" (UID: \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\") " pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.744445 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:48 crc kubenswrapper[4830]: I0311 09:33:48.836616 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.165586 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p7nmp"] Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.286690 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c667-account-create-update-cpknz"] Mar 11 09:33:49 crc kubenswrapper[4830]: W0311 09:33:49.288735 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b71ab87_579d_41fb_b094_92cc4e3fe5b6.slice/crio-7147d0638bd9f2e83c19be46ce2248f3cc43f1a37a87a130f3a11afb0719eff3 WatchSource:0}: Error finding container 7147d0638bd9f2e83c19be46ce2248f3cc43f1a37a87a130f3a11afb0719eff3: Status 404 returned error can't find the container with id 7147d0638bd9f2e83c19be46ce2248f3cc43f1a37a87a130f3a11afb0719eff3 Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.345371 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mmh84"] Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.346400 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mmh84"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.354640 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mmh84"]
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.446273 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dd45-account-create-update-8gxtl"]
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.447515 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dd45-account-create-update-8gxtl"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.449149 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.455674 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dd45-account-create-update-8gxtl"]
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.526280 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pnhw\" (UniqueName: \"kubernetes.io/projected/76a18449-bdae-4eab-a9f8-816cf3064929-kube-api-access-4pnhw\") pod \"keystone-db-create-mmh84\" (UID: \"76a18449-bdae-4eab-a9f8-816cf3064929\") " pod="openstack/keystone-db-create-mmh84"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.526650 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a18449-bdae-4eab-a9f8-816cf3064929-operator-scripts\") pod \"keystone-db-create-mmh84\" (UID: \"76a18449-bdae-4eab-a9f8-816cf3064929\") " pod="openstack/keystone-db-create-mmh84"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.563953 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c667-account-create-update-cpknz" event={"ID":"0b71ab87-579d-41fb-b094-92cc4e3fe5b6","Type":"ContainerStarted","Data":"78d9cabea547fb73b5971d1d293cd333fe8bc2c62af293ba0d803ed1337258be"}
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.564298 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c667-account-create-update-cpknz" event={"ID":"0b71ab87-579d-41fb-b094-92cc4e3fe5b6","Type":"ContainerStarted","Data":"7147d0638bd9f2e83c19be46ce2248f3cc43f1a37a87a130f3a11afb0719eff3"}
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.567724 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7nmp" event={"ID":"a42ddef2-86d1-435e-b240-f159db41a2d2","Type":"ContainerStarted","Data":"f5853cc6f292f2f21212e4bae7ef4a727a593062e600d0aa5a83510a873757ca"}
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.567952 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7nmp" event={"ID":"a42ddef2-86d1-435e-b240-f159db41a2d2","Type":"ContainerStarted","Data":"0e04559d21b452407c9424b56ed4e50c015989a0b0e067f322a39d4365c849b4"}
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.617366 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-nvf88"]
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.619374 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-c667-account-create-update-cpknz" podStartSLOduration=1.6193537199999999 podStartE2EDuration="1.61935372s" podCreationTimestamp="2026-03-11 09:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:49.585897343 +0000 UTC m=+1197.367048052" watchObservedRunningTime="2026-03-11 09:33:49.61935372 +0000 UTC m=+1197.400504409"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.620001 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-nvf88"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.630190 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-operator-scripts\") pod \"keystone-dd45-account-create-update-8gxtl\" (UID: \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\") " pod="openstack/keystone-dd45-account-create-update-8gxtl"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.630241 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bzmg\" (UniqueName: \"kubernetes.io/projected/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-kube-api-access-9bzmg\") pod \"keystone-dd45-account-create-update-8gxtl\" (UID: \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\") " pod="openstack/keystone-dd45-account-create-update-8gxtl"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.630293 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pnhw\" (UniqueName: \"kubernetes.io/projected/76a18449-bdae-4eab-a9f8-816cf3064929-kube-api-access-4pnhw\") pod \"keystone-db-create-mmh84\" (UID: \"76a18449-bdae-4eab-a9f8-816cf3064929\") " pod="openstack/keystone-db-create-mmh84"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.630312 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a18449-bdae-4eab-a9f8-816cf3064929-operator-scripts\") pod \"keystone-db-create-mmh84\" (UID: \"76a18449-bdae-4eab-a9f8-816cf3064929\") " pod="openstack/keystone-db-create-mmh84"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.633810 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a18449-bdae-4eab-a9f8-816cf3064929-operator-scripts\") pod \"keystone-db-create-mmh84\" (UID: \"76a18449-bdae-4eab-a9f8-816cf3064929\") " pod="openstack/keystone-db-create-mmh84"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.644166 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-nvf88"]
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.647976 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-p7nmp" podStartSLOduration=1.647947572 podStartE2EDuration="1.647947572s" podCreationTimestamp="2026-03-11 09:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:49.61068645 +0000 UTC m=+1197.391837139" watchObservedRunningTime="2026-03-11 09:33:49.647947572 +0000 UTC m=+1197.429098261"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.656959 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pnhw\" (UniqueName: \"kubernetes.io/projected/76a18449-bdae-4eab-a9f8-816cf3064929-kube-api-access-4pnhw\") pod \"keystone-db-create-mmh84\" (UID: \"76a18449-bdae-4eab-a9f8-816cf3064929\") " pod="openstack/keystone-db-create-mmh84"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.674892 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mmh84"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.731399 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8f302b2-d44c-4598-89d7-1592d4e14f6d-operator-scripts\") pod \"placement-db-create-nvf88\" (UID: \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\") " pod="openstack/placement-db-create-nvf88"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.731573 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-operator-scripts\") pod \"keystone-dd45-account-create-update-8gxtl\" (UID: \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\") " pod="openstack/keystone-dd45-account-create-update-8gxtl"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.731616 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bzmg\" (UniqueName: \"kubernetes.io/projected/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-kube-api-access-9bzmg\") pod \"keystone-dd45-account-create-update-8gxtl\" (UID: \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\") " pod="openstack/keystone-dd45-account-create-update-8gxtl"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.731672 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxvv2\" (UniqueName: \"kubernetes.io/projected/f8f302b2-d44c-4598-89d7-1592d4e14f6d-kube-api-access-qxvv2\") pod \"placement-db-create-nvf88\" (UID: \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\") " pod="openstack/placement-db-create-nvf88"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.733115 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-operator-scripts\") pod \"keystone-dd45-account-create-update-8gxtl\" (UID: \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\") " pod="openstack/keystone-dd45-account-create-update-8gxtl"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.753622 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bzmg\" (UniqueName: \"kubernetes.io/projected/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-kube-api-access-9bzmg\") pod \"keystone-dd45-account-create-update-8gxtl\" (UID: \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\") " pod="openstack/keystone-dd45-account-create-update-8gxtl"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.761129 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-125c-account-create-update-rnfqn"]
Mar 11 09:33:49 crc kubenswrapper[4830]: E0311 09:33:49.761502 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda42ddef2_86d1_435e_b240_f159db41a2d2.slice/crio-conmon-f5853cc6f292f2f21212e4bae7ef4a727a593062e600d0aa5a83510a873757ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda42ddef2_86d1_435e_b240_f159db41a2d2.slice/crio-f5853cc6f292f2f21212e4bae7ef4a727a593062e600d0aa5a83510a873757ca.scope\": RecentStats: unable to find data in memory cache]"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.762915 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-125c-account-create-update-rnfqn"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.763942 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dd45-account-create-update-8gxtl"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.765683 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.773852 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-125c-account-create-update-rnfqn"]
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.834352 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxvv2\" (UniqueName: \"kubernetes.io/projected/f8f302b2-d44c-4598-89d7-1592d4e14f6d-kube-api-access-qxvv2\") pod \"placement-db-create-nvf88\" (UID: \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\") " pod="openstack/placement-db-create-nvf88"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.835899 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8f302b2-d44c-4598-89d7-1592d4e14f6d-operator-scripts\") pod \"placement-db-create-nvf88\" (UID: \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\") " pod="openstack/placement-db-create-nvf88"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.837498 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8f302b2-d44c-4598-89d7-1592d4e14f6d-operator-scripts\") pod \"placement-db-create-nvf88\" (UID: \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\") " pod="openstack/placement-db-create-nvf88"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.857826 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxvv2\" (UniqueName: \"kubernetes.io/projected/f8f302b2-d44c-4598-89d7-1592d4e14f6d-kube-api-access-qxvv2\") pod \"placement-db-create-nvf88\" (UID: \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\") " pod="openstack/placement-db-create-nvf88"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.939440 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6497d6f-b364-4d49-8825-08dd005ef1a1-operator-scripts\") pod \"placement-125c-account-create-update-rnfqn\" (UID: \"d6497d6f-b364-4d49-8825-08dd005ef1a1\") " pod="openstack/placement-125c-account-create-update-rnfqn"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.939562 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w45v\" (UniqueName: \"kubernetes.io/projected/d6497d6f-b364-4d49-8825-08dd005ef1a1-kube-api-access-7w45v\") pod \"placement-125c-account-create-update-rnfqn\" (UID: \"d6497d6f-b364-4d49-8825-08dd005ef1a1\") " pod="openstack/placement-125c-account-create-update-rnfqn"
Mar 11 09:33:49 crc kubenswrapper[4830]: I0311 09:33:49.970646 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-nvf88"
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.042508 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6497d6f-b364-4d49-8825-08dd005ef1a1-operator-scripts\") pod \"placement-125c-account-create-update-rnfqn\" (UID: \"d6497d6f-b364-4d49-8825-08dd005ef1a1\") " pod="openstack/placement-125c-account-create-update-rnfqn"
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.042625 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w45v\" (UniqueName: \"kubernetes.io/projected/d6497d6f-b364-4d49-8825-08dd005ef1a1-kube-api-access-7w45v\") pod \"placement-125c-account-create-update-rnfqn\" (UID: \"d6497d6f-b364-4d49-8825-08dd005ef1a1\") " pod="openstack/placement-125c-account-create-update-rnfqn"
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.044009 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6497d6f-b364-4d49-8825-08dd005ef1a1-operator-scripts\") pod \"placement-125c-account-create-update-rnfqn\" (UID: \"d6497d6f-b364-4d49-8825-08dd005ef1a1\") " pod="openstack/placement-125c-account-create-update-rnfqn"
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.062814 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w45v\" (UniqueName: \"kubernetes.io/projected/d6497d6f-b364-4d49-8825-08dd005ef1a1-kube-api-access-7w45v\") pod \"placement-125c-account-create-update-rnfqn\" (UID: \"d6497d6f-b364-4d49-8825-08dd005ef1a1\") " pod="openstack/placement-125c-account-create-update-rnfqn"
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.203402 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mmh84"]
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.223065 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-125c-account-create-update-rnfqn"
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.331943 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dd45-account-create-update-8gxtl"]
Mar 11 09:33:50 crc kubenswrapper[4830]: W0311 09:33:50.347170 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ea8a5a_b51d_4cec_a7b6_c83a3fdc428b.slice/crio-0aefcdd235ea0c567865982b3e032e6c8d0e1029be3bcdaccec7205dfad988c1 WatchSource:0}: Error finding container 0aefcdd235ea0c567865982b3e032e6c8d0e1029be3bcdaccec7205dfad988c1: Status 404 returned error can't find the container with id 0aefcdd235ea0c567865982b3e032e6c8d0e1029be3bcdaccec7205dfad988c1
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.441770 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-nvf88"]
Mar 11 09:33:50 crc kubenswrapper[4830]: W0311 09:33:50.467378 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8f302b2_d44c_4598_89d7_1592d4e14f6d.slice/crio-a80c0d5c09c262444ba93738b38c1f69399df6d0f2968165c301b14e27145da9 WatchSource:0}: Error finding container a80c0d5c09c262444ba93738b38c1f69399df6d0f2968165c301b14e27145da9: Status 404 returned error can't find the container with id a80c0d5c09c262444ba93738b38c1f69399df6d0f2968165c301b14e27145da9
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.593430 4830 generic.go:334] "Generic (PLEG): container finished" podID="0b71ab87-579d-41fb-b094-92cc4e3fe5b6" containerID="78d9cabea547fb73b5971d1d293cd333fe8bc2c62af293ba0d803ed1337258be" exitCode=0
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.593828 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c667-account-create-update-cpknz" event={"ID":"0b71ab87-579d-41fb-b094-92cc4e3fe5b6","Type":"ContainerDied","Data":"78d9cabea547fb73b5971d1d293cd333fe8bc2c62af293ba0d803ed1337258be"}
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.598386 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mmh84" event={"ID":"76a18449-bdae-4eab-a9f8-816cf3064929","Type":"ContainerStarted","Data":"2b73748f10aa4dae27415bfa62deab42b8e1b2593deec5e6d6ae2eae42751d59"}
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.598419 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mmh84" event={"ID":"76a18449-bdae-4eab-a9f8-816cf3064929","Type":"ContainerStarted","Data":"15532c291e3d1ee8628b95ba4d2921c2be7212bbbd075d3ddaf7832ebb2651b2"}
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.602652 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd45-account-create-update-8gxtl" event={"ID":"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b","Type":"ContainerStarted","Data":"e88371dc46ca1125d9d23340678b6e346e1f5327d6b895fe1455cd7c6fa7e8fc"}
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.602701 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd45-account-create-update-8gxtl" event={"ID":"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b","Type":"ContainerStarted","Data":"0aefcdd235ea0c567865982b3e032e6c8d0e1029be3bcdaccec7205dfad988c1"}
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.606096 4830 generic.go:334] "Generic (PLEG): container finished" podID="a42ddef2-86d1-435e-b240-f159db41a2d2" containerID="f5853cc6f292f2f21212e4bae7ef4a727a593062e600d0aa5a83510a873757ca" exitCode=0
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.606152 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7nmp" event={"ID":"a42ddef2-86d1-435e-b240-f159db41a2d2","Type":"ContainerDied","Data":"f5853cc6f292f2f21212e4bae7ef4a727a593062e600d0aa5a83510a873757ca"}
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.606980 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-nvf88" event={"ID":"f8f302b2-d44c-4598-89d7-1592d4e14f6d","Type":"ContainerStarted","Data":"a80c0d5c09c262444ba93738b38c1f69399df6d0f2968165c301b14e27145da9"}
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.639399 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-mmh84" podStartSLOduration=1.6393770540000001 podStartE2EDuration="1.639377054s" podCreationTimestamp="2026-03-11 09:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:50.638180601 +0000 UTC m=+1198.419331310" watchObservedRunningTime="2026-03-11 09:33:50.639377054 +0000 UTC m=+1198.420527743"
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.660722 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-dd45-account-create-update-8gxtl" podStartSLOduration=1.660706485 podStartE2EDuration="1.660706485s" podCreationTimestamp="2026-03-11 09:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:50.658438922 +0000 UTC m=+1198.439589611" watchObservedRunningTime="2026-03-11 09:33:50.660706485 +0000 UTC m=+1198.441857174"
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.704386 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-125c-account-create-update-rnfqn"]
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.988964 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-j8pmz"]
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.989464 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" podUID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerName="dnsmasq-dns" containerID="cri-o://c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846" gracePeriod=10
Mar 11 09:33:50 crc kubenswrapper[4830]: I0311 09:33:50.993258 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.012182 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-4hkw2"]
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.027605 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.028850 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-4hkw2"]
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.110273 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" podUID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.185153 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.185217 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.185300 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6slm\" (UniqueName: \"kubernetes.io/projected/6d65cbea-4a39-478f-91b1-60e6dd72b135-kube-api-access-f6slm\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.186781 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.186859 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-config\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.288336 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-config\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.288451 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.288479 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.288538 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6slm\" (UniqueName: \"kubernetes.io/projected/6d65cbea-4a39-478f-91b1-60e6dd72b135-kube-api-access-f6slm\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.288558 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.289340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-config\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.289358 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.289905 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.290471 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.310220 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6slm\" (UniqueName: \"kubernetes.io/projected/6d65cbea-4a39-478f-91b1-60e6dd72b135-kube-api-access-f6slm\") pod \"dnsmasq-dns-b8fbc5445-4hkw2\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") " pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.457318 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.460676 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-mkvnn"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.479567 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-w4w6w"]
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.508198 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-w4w6w"]
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.554596 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.635170 4830 generic.go:334] "Generic (PLEG): container finished" podID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerID="c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846" exitCode=0
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.635243 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" event={"ID":"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1","Type":"ContainerDied","Data":"c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846"}
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.635275 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz" event={"ID":"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1","Type":"ContainerDied","Data":"7792fc8b91c2f137df802e987196e89b26a1c9367c16d0d900f4593c380edd8d"}
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.635294 4830 scope.go:117] "RemoveContainer" containerID="c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.635430 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-j8pmz"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.644675 4830 generic.go:334] "Generic (PLEG): container finished" podID="76a18449-bdae-4eab-a9f8-816cf3064929" containerID="2b73748f10aa4dae27415bfa62deab42b8e1b2593deec5e6d6ae2eae42751d59" exitCode=0
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.645433 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mmh84" event={"ID":"76a18449-bdae-4eab-a9f8-816cf3064929","Type":"ContainerDied","Data":"2b73748f10aa4dae27415bfa62deab42b8e1b2593deec5e6d6ae2eae42751d59"}
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.653036 4830 generic.go:334] "Generic (PLEG): container finished" podID="88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b" containerID="e88371dc46ca1125d9d23340678b6e346e1f5327d6b895fe1455cd7c6fa7e8fc" exitCode=0
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.653116 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd45-account-create-update-8gxtl" event={"ID":"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b","Type":"ContainerDied","Data":"e88371dc46ca1125d9d23340678b6e346e1f5327d6b895fe1455cd7c6fa7e8fc"}
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.654849 4830 generic.go:334] "Generic (PLEG): container finished" podID="d6497d6f-b364-4d49-8825-08dd005ef1a1" containerID="f6a94db2e611365246e47462983d5ebdd1beadbe5c1b6c45eecb86efb40ba8e1" exitCode=0
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.654943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-125c-account-create-update-rnfqn" event={"ID":"d6497d6f-b364-4d49-8825-08dd005ef1a1","Type":"ContainerDied","Data":"f6a94db2e611365246e47462983d5ebdd1beadbe5c1b6c45eecb86efb40ba8e1"}
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.655003 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-125c-account-create-update-rnfqn" event={"ID":"d6497d6f-b364-4d49-8825-08dd005ef1a1","Type":"ContainerStarted","Data":"45d448324966a8c8090615425117569a4b1df0244c37d2eff00c41c3f1d86b9d"}
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.656596 4830 generic.go:334] "Generic (PLEG): container finished" podID="f8f302b2-d44c-4598-89d7-1592d4e14f6d" containerID="969c747b2572a12bbb9f3852d36e59f19d952563c3924cb9adb87ac12691f447" exitCode=0
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.656693 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-nvf88" event={"ID":"f8f302b2-d44c-4598-89d7-1592d4e14f6d","Type":"ContainerDied","Data":"969c747b2572a12bbb9f3852d36e59f19d952563c3924cb9adb87ac12691f447"}
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.668640 4830 scope.go:117] "RemoveContainer" containerID="2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.703584 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-dns-svc\") pod \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") "
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.703675 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-config\") pod \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") "
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.703757 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnq45\" (UniqueName: \"kubernetes.io/projected/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-kube-api-access-fnq45\") pod \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") "
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.703779 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-ovsdbserver-nb\") pod \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\" (UID: \"47c1b6a9-a242-44d3-b9e8-f02c8d925ea1\") "
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.715250 4830 scope.go:117] "RemoveContainer" containerID="c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.715547 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-kube-api-access-fnq45" (OuterVolumeSpecName: "kube-api-access-fnq45") pod "47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" (UID: "47c1b6a9-a242-44d3-b9e8-f02c8d925ea1"). InnerVolumeSpecName "kube-api-access-fnq45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:33:51 crc kubenswrapper[4830]: E0311 09:33:51.719896 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846\": container with ID starting with c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846 not found: ID does not exist" containerID="c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.719949 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846"} err="failed to get container status \"c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846\": rpc error: code = NotFound desc = could not find container \"c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846\": container with ID starting with c8700b1238087564bf4896a85aaf4f5ad710e1b3b4219f614fe0866f09064846 not found: ID does not exist"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.719976 4830 scope.go:117] "RemoveContainer" containerID="2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f"
Mar 11 09:33:51 crc kubenswrapper[4830]: E0311 09:33:51.720680 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f\": container with ID starting with 2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f not found: ID does not exist" containerID="2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.720753 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f"} err="failed to get container status \"2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f\": rpc error: code = NotFound desc = could not find container \"2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f\": container with ID starting with 2b424804eeb2765630713315726674f13aedccb1c9a4a41b946afc38232b501f not found: ID does not exist"
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.747885 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-config" (OuterVolumeSpecName: "config") pod "47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" (UID: "47c1b6a9-a242-44d3-b9e8-f02c8d925ea1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.754139 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" (UID: "47c1b6a9-a242-44d3-b9e8-f02c8d925ea1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.791404 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" (UID: "47c1b6a9-a242-44d3-b9e8-f02c8d925ea1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.806509 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.806539 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-config\") on node \"crc\" DevicePath \"\""
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.806551 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnq45\" (UniqueName: \"kubernetes.io/projected/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-kube-api-access-fnq45\") on node \"crc\" DevicePath \"\""
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.806565 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.971703 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-4hkw2"]
Mar 11 09:33:51 crc kubenswrapper[4830]: W0311 09:33:51.976512 4830 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d65cbea_4a39_478f_91b1_60e6dd72b135.slice/crio-631900113de4d14ee4ac239ca5e7198b3bab88e11a11bd68539d527c81e0cebf WatchSource:0}: Error finding container 631900113de4d14ee4ac239ca5e7198b3bab88e11a11bd68539d527c81e0cebf: Status 404 returned error can't find the container with id 631900113de4d14ee4ac239ca5e7198b3bab88e11a11bd68539d527c81e0cebf Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.980601 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-j8pmz"] Mar 11 09:33:51 crc kubenswrapper[4830]: I0311 09:33:51.993791 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-j8pmz"] Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.135977 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.136624 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerName="dnsmasq-dns" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.136638 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerName="dnsmasq-dns" Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.136668 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerName="init" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.136676 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerName="init" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.136837 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" containerName="dnsmasq-dns" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.148353 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.152672 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.152775 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-75zp4" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.152874 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.153064 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.168154 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.199659 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.207653 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317049 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-operator-scripts\") pod \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\" (UID: \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\") " Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317160 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ms9k\" (UniqueName: \"kubernetes.io/projected/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-kube-api-access-5ms9k\") pod \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\" (UID: \"0b71ab87-579d-41fb-b094-92cc4e3fe5b6\") " Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317232 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42ddef2-86d1-435e-b240-f159db41a2d2-operator-scripts\") pod \"a42ddef2-86d1-435e-b240-f159db41a2d2\" (UID: \"a42ddef2-86d1-435e-b240-f159db41a2d2\") " Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317266 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wgcl\" (UniqueName: \"kubernetes.io/projected/a42ddef2-86d1-435e-b240-f159db41a2d2-kube-api-access-7wgcl\") pod \"a42ddef2-86d1-435e-b240-f159db41a2d2\" (UID: \"a42ddef2-86d1-435e-b240-f159db41a2d2\") " Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317610 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db4bedf3-ea20-4a63-9623-96286e9b243b-cache\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317718 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4bedf3-ea20-4a63-9623-96286e9b243b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317749 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/db4bedf3-ea20-4a63-9623-96286e9b243b-lock\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317805 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317804 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b71ab87-579d-41fb-b094-92cc4e3fe5b6" (UID: "0b71ab87-579d-41fb-b094-92cc4e3fe5b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.317905 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42ddef2-86d1-435e-b240-f159db41a2d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a42ddef2-86d1-435e-b240-f159db41a2d2" (UID: "a42ddef2-86d1-435e-b240-f159db41a2d2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.318190 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.318248 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52tq7\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-kube-api-access-52tq7\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.318420 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a42ddef2-86d1-435e-b240-f159db41a2d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.318437 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.321357 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42ddef2-86d1-435e-b240-f159db41a2d2-kube-api-access-7wgcl" (OuterVolumeSpecName: "kube-api-access-7wgcl") pod "a42ddef2-86d1-435e-b240-f159db41a2d2" (UID: "a42ddef2-86d1-435e-b240-f159db41a2d2"). InnerVolumeSpecName "kube-api-access-7wgcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.321477 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-kube-api-access-5ms9k" (OuterVolumeSpecName: "kube-api-access-5ms9k") pod "0b71ab87-579d-41fb-b094-92cc4e3fe5b6" (UID: "0b71ab87-579d-41fb-b094-92cc4e3fe5b6"). InnerVolumeSpecName "kube-api-access-5ms9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.419691 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52tq7\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-kube-api-access-52tq7\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.419767 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db4bedf3-ea20-4a63-9623-96286e9b243b-cache\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.419805 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4bedf3-ea20-4a63-9623-96286e9b243b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.419822 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/db4bedf3-ea20-4a63-9623-96286e9b243b-lock\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 
09:33:52.419858 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.419916 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.419957 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wgcl\" (UniqueName: \"kubernetes.io/projected/a42ddef2-86d1-435e-b240-f159db41a2d2-kube-api-access-7wgcl\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.419968 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ms9k\" (UniqueName: \"kubernetes.io/projected/0b71ab87-579d-41fb-b094-92cc4e3fe5b6-kube-api-access-5ms9k\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.420246 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.420320 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.420363 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 09:33:52 crc 
kubenswrapper[4830]: E0311 09:33:52.420423 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift podName:db4bedf3-ea20-4a63-9623-96286e9b243b nodeName:}" failed. No retries permitted until 2026-03-11 09:33:52.920401994 +0000 UTC m=+1200.701552763 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift") pod "swift-storage-0" (UID: "db4bedf3-ea20-4a63-9623-96286e9b243b") : configmap "swift-ring-files" not found Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.420439 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db4bedf3-ea20-4a63-9623-96286e9b243b-cache\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.420492 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/db4bedf3-ea20-4a63-9623-96286e9b243b-lock\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.425239 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4bedf3-ea20-4a63-9623-96286e9b243b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.436974 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52tq7\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-kube-api-access-52tq7\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " 
pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.441424 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.666427 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p7nmp" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.666417 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7nmp" event={"ID":"a42ddef2-86d1-435e-b240-f159db41a2d2","Type":"ContainerDied","Data":"0e04559d21b452407c9424b56ed4e50c015989a0b0e067f322a39d4365c849b4"} Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.666568 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e04559d21b452407c9424b56ed4e50c015989a0b0e067f322a39d4365c849b4" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.668010 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c667-account-create-update-cpknz" event={"ID":"0b71ab87-579d-41fb-b094-92cc4e3fe5b6","Type":"ContainerDied","Data":"7147d0638bd9f2e83c19be46ce2248f3cc43f1a37a87a130f3a11afb0719eff3"} Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.668057 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7147d0638bd9f2e83c19be46ce2248f3cc43f1a37a87a130f3a11afb0719eff3" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.668099 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c667-account-create-update-cpknz" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.675376 4830 generic.go:334] "Generic (PLEG): container finished" podID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerID="ba4e9de0692fbd8a3d2d164ab4dde81fd85ae4991a0736e46a21f75f11d3102f" exitCode=0 Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.676348 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" event={"ID":"6d65cbea-4a39-478f-91b1-60e6dd72b135","Type":"ContainerDied","Data":"ba4e9de0692fbd8a3d2d164ab4dde81fd85ae4991a0736e46a21f75f11d3102f"} Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.676383 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" event={"ID":"6d65cbea-4a39-478f-91b1-60e6dd72b135","Type":"ContainerStarted","Data":"631900113de4d14ee4ac239ca5e7198b3bab88e11a11bd68539d527c81e0cebf"} Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.774551 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jmd8g"] Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.781415 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b71ab87-579d-41fb-b094-92cc4e3fe5b6" containerName="mariadb-account-create-update" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.781460 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b71ab87-579d-41fb-b094-92cc4e3fe5b6" containerName="mariadb-account-create-update" Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.781497 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42ddef2-86d1-435e-b240-f159db41a2d2" containerName="mariadb-database-create" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.781505 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42ddef2-86d1-435e-b240-f159db41a2d2" containerName="mariadb-database-create" Mar 11 09:33:52 crc kubenswrapper[4830]: 
I0311 09:33:52.782311 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42ddef2-86d1-435e-b240-f159db41a2d2" containerName="mariadb-database-create" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.782344 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b71ab87-579d-41fb-b094-92cc4e3fe5b6" containerName="mariadb-account-create-update" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.783609 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.797281 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jmd8g"] Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.797769 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.797911 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.798611 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.936093 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-combined-ca-bundle\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.936630 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5t95\" (UniqueName: \"kubernetes.io/projected/9e52669c-56df-4791-84e7-4d4bd34e420f-kube-api-access-f5t95\") pod \"swift-ring-rebalance-jmd8g\" (UID: 
\"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.936666 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-swiftconf\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.936816 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e52669c-56df-4791-84e7-4d4bd34e420f-etc-swift\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.936954 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-dispersionconf\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.937030 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.937130 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-scripts\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " 
pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.937170 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.937190 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.937196 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-ring-data-devices\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:52 crc kubenswrapper[4830]: E0311 09:33:52.937238 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift podName:db4bedf3-ea20-4a63-9623-96286e9b243b nodeName:}" failed. No retries permitted until 2026-03-11 09:33:53.937220445 +0000 UTC m=+1201.718371194 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift") pod "swift-storage-0" (UID: "db4bedf3-ea20-4a63-9623-96286e9b243b") : configmap "swift-ring-files" not found Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.960635 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47c1b6a9-a242-44d3-b9e8-f02c8d925ea1" path="/var/lib/kubelet/pods/47c1b6a9-a242-44d3-b9e8-f02c8d925ea1/volumes" Mar 11 09:33:52 crc kubenswrapper[4830]: I0311 09:33:52.961737 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72de00a2-a84b-46ad-9830-d880367fea73" path="/var/lib/kubelet/pods/72de00a2-a84b-46ad-9830-d880367fea73/volumes" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.039330 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-dispersionconf\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.039646 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-scripts\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.039664 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-ring-data-devices\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.039774 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-combined-ca-bundle\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.039805 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5t95\" (UniqueName: \"kubernetes.io/projected/9e52669c-56df-4791-84e7-4d4bd34e420f-kube-api-access-f5t95\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.039828 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-swiftconf\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.039875 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e52669c-56df-4791-84e7-4d4bd34e420f-etc-swift\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.040284 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e52669c-56df-4791-84e7-4d4bd34e420f-etc-swift\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.051923 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-swiftconf\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.052796 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.054525 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-combined-ca-bundle\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.079328 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.079500 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.080104 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-ring-data-devices\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.082287 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-scripts\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.098559 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-dispersionconf\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.100511 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5t95\" (UniqueName: \"kubernetes.io/projected/9e52669c-56df-4791-84e7-4d4bd34e420f-kube-api-access-f5t95\") pod \"swift-ring-rebalance-jmd8g\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.224528 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-75zp4" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.225105 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.281605 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-nvf88" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.351059 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8f302b2-d44c-4598-89d7-1592d4e14f6d-operator-scripts\") pod \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\" (UID: \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\") " Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.352720 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxvv2\" (UniqueName: \"kubernetes.io/projected/f8f302b2-d44c-4598-89d7-1592d4e14f6d-kube-api-access-qxvv2\") pod \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\" (UID: \"f8f302b2-d44c-4598-89d7-1592d4e14f6d\") " Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.352174 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8f302b2-d44c-4598-89d7-1592d4e14f6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8f302b2-d44c-4598-89d7-1592d4e14f6d" (UID: "f8f302b2-d44c-4598-89d7-1592d4e14f6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.355992 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8f302b2-d44c-4598-89d7-1592d4e14f6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.360105 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f302b2-d44c-4598-89d7-1592d4e14f6d-kube-api-access-qxvv2" (OuterVolumeSpecName: "kube-api-access-qxvv2") pod "f8f302b2-d44c-4598-89d7-1592d4e14f6d" (UID: "f8f302b2-d44c-4598-89d7-1592d4e14f6d"). InnerVolumeSpecName "kube-api-access-qxvv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.483167 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxvv2\" (UniqueName: \"kubernetes.io/projected/f8f302b2-d44c-4598-89d7-1592d4e14f6d-kube-api-access-qxvv2\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.490441 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-125c-account-create-update-rnfqn" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.506181 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mmh84" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.514311 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dd45-account-create-update-8gxtl" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.583839 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6497d6f-b364-4d49-8825-08dd005ef1a1-operator-scripts\") pod \"d6497d6f-b364-4d49-8825-08dd005ef1a1\" (UID: \"d6497d6f-b364-4d49-8825-08dd005ef1a1\") " Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.584153 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w45v\" (UniqueName: \"kubernetes.io/projected/d6497d6f-b364-4d49-8825-08dd005ef1a1-kube-api-access-7w45v\") pod \"d6497d6f-b364-4d49-8825-08dd005ef1a1\" (UID: \"d6497d6f-b364-4d49-8825-08dd005ef1a1\") " Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.584658 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6497d6f-b364-4d49-8825-08dd005ef1a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6497d6f-b364-4d49-8825-08dd005ef1a1" (UID: 
"d6497d6f-b364-4d49-8825-08dd005ef1a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.589090 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6497d6f-b364-4d49-8825-08dd005ef1a1-kube-api-access-7w45v" (OuterVolumeSpecName: "kube-api-access-7w45v") pod "d6497d6f-b364-4d49-8825-08dd005ef1a1" (UID: "d6497d6f-b364-4d49-8825-08dd005ef1a1"). InnerVolumeSpecName "kube-api-access-7w45v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.685806 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-operator-scripts\") pod \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\" (UID: \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\") " Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.685893 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bzmg\" (UniqueName: \"kubernetes.io/projected/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-kube-api-access-9bzmg\") pod \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\" (UID: \"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b\") " Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.686111 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pnhw\" (UniqueName: \"kubernetes.io/projected/76a18449-bdae-4eab-a9f8-816cf3064929-kube-api-access-4pnhw\") pod \"76a18449-bdae-4eab-a9f8-816cf3064929\" (UID: \"76a18449-bdae-4eab-a9f8-816cf3064929\") " Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.686163 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a18449-bdae-4eab-a9f8-816cf3064929-operator-scripts\") pod 
\"76a18449-bdae-4eab-a9f8-816cf3064929\" (UID: \"76a18449-bdae-4eab-a9f8-816cf3064929\") " Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.686478 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b" (UID: "88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.686624 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.686637 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6497d6f-b364-4d49-8825-08dd005ef1a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.686646 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w45v\" (UniqueName: \"kubernetes.io/projected/d6497d6f-b364-4d49-8825-08dd005ef1a1-kube-api-access-7w45v\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.686979 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a18449-bdae-4eab-a9f8-816cf3064929-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76a18449-bdae-4eab-a9f8-816cf3064929" (UID: "76a18449-bdae-4eab-a9f8-816cf3064929"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.690270 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-kube-api-access-9bzmg" (OuterVolumeSpecName: "kube-api-access-9bzmg") pod "88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b" (UID: "88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b"). InnerVolumeSpecName "kube-api-access-9bzmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.691130 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a18449-bdae-4eab-a9f8-816cf3064929-kube-api-access-4pnhw" (OuterVolumeSpecName: "kube-api-access-4pnhw") pod "76a18449-bdae-4eab-a9f8-816cf3064929" (UID: "76a18449-bdae-4eab-a9f8-816cf3064929"). InnerVolumeSpecName "kube-api-access-4pnhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.697058 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mmh84" event={"ID":"76a18449-bdae-4eab-a9f8-816cf3064929","Type":"ContainerDied","Data":"15532c291e3d1ee8628b95ba4d2921c2be7212bbbd075d3ddaf7832ebb2651b2"} Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.697216 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15532c291e3d1ee8628b95ba4d2921c2be7212bbbd075d3ddaf7832ebb2651b2" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.697584 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mmh84" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.699870 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd45-account-create-update-8gxtl" event={"ID":"88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b","Type":"ContainerDied","Data":"0aefcdd235ea0c567865982b3e032e6c8d0e1029be3bcdaccec7205dfad988c1"} Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.699907 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aefcdd235ea0c567865982b3e032e6c8d0e1029be3bcdaccec7205dfad988c1" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.700430 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dd45-account-create-update-8gxtl" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.704213 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" event={"ID":"6d65cbea-4a39-478f-91b1-60e6dd72b135","Type":"ContainerStarted","Data":"a6d584156d01702725959c856988ee13f058d3221156dd9a2cc836c7abf66ccf"} Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.704364 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.714667 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-125c-account-create-update-rnfqn" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.714674 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-125c-account-create-update-rnfqn" event={"ID":"d6497d6f-b364-4d49-8825-08dd005ef1a1","Type":"ContainerDied","Data":"45d448324966a8c8090615425117569a4b1df0244c37d2eff00c41c3f1d86b9d"} Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.714721 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45d448324966a8c8090615425117569a4b1df0244c37d2eff00c41c3f1d86b9d" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.719518 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-nvf88" event={"ID":"f8f302b2-d44c-4598-89d7-1592d4e14f6d","Type":"ContainerDied","Data":"a80c0d5c09c262444ba93738b38c1f69399df6d0f2968165c301b14e27145da9"} Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.719572 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a80c0d5c09c262444ba93738b38c1f69399df6d0f2968165c301b14e27145da9" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.719644 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-nvf88" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.746247 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8cx6g"] Mar 11 09:33:53 crc kubenswrapper[4830]: E0311 09:33:53.747323 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6497d6f-b364-4d49-8825-08dd005ef1a1" containerName="mariadb-account-create-update" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.747347 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6497d6f-b364-4d49-8825-08dd005ef1a1" containerName="mariadb-account-create-update" Mar 11 09:33:53 crc kubenswrapper[4830]: E0311 09:33:53.747371 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f302b2-d44c-4598-89d7-1592d4e14f6d" containerName="mariadb-database-create" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.747388 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f302b2-d44c-4598-89d7-1592d4e14f6d" containerName="mariadb-database-create" Mar 11 09:33:53 crc kubenswrapper[4830]: E0311 09:33:53.747405 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b" containerName="mariadb-account-create-update" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.747412 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b" containerName="mariadb-account-create-update" Mar 11 09:33:53 crc kubenswrapper[4830]: E0311 09:33:53.747428 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a18449-bdae-4eab-a9f8-816cf3064929" containerName="mariadb-database-create" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.747434 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a18449-bdae-4eab-a9f8-816cf3064929" containerName="mariadb-database-create" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.747610 4830 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b" containerName="mariadb-account-create-update" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.747626 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6497d6f-b364-4d49-8825-08dd005ef1a1" containerName="mariadb-account-create-update" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.747647 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f302b2-d44c-4598-89d7-1592d4e14f6d" containerName="mariadb-database-create" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.747662 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a18449-bdae-4eab-a9f8-816cf3064929" containerName="mariadb-database-create" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.748422 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.751577 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.751996 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9vttg" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.754809 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" podStartSLOduration=3.754784189 podStartE2EDuration="3.754784189s" podCreationTimestamp="2026-03-11 09:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:33:53.741934312 +0000 UTC m=+1201.523085011" watchObservedRunningTime="2026-03-11 09:33:53.754784189 +0000 UTC m=+1201.535934888" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.769442 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-db-sync-8cx6g"] Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.794148 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pnhw\" (UniqueName: \"kubernetes.io/projected/76a18449-bdae-4eab-a9f8-816cf3064929-kube-api-access-4pnhw\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.794181 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76a18449-bdae-4eab-a9f8-816cf3064929-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.794195 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bzmg\" (UniqueName: \"kubernetes.io/projected/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b-kube-api-access-9bzmg\") on node \"crc\" DevicePath \"\"" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.837840 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jmd8g"] Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.895315 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-config-data\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.895361 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-db-sync-config-data\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.895391 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-combined-ca-bundle\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.895649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp7tg\" (UniqueName: \"kubernetes.io/projected/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-kube-api-access-tp7tg\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.996846 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-config-data\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.996900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-db-sync-config-data\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.996933 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-combined-ca-bundle\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.997245 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:53 crc kubenswrapper[4830]: E0311 09:33:53.997417 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 09:33:53 crc kubenswrapper[4830]: E0311 09:33:53.997438 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 09:33:53 crc kubenswrapper[4830]: E0311 09:33:53.997483 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift podName:db4bedf3-ea20-4a63-9623-96286e9b243b nodeName:}" failed. No retries permitted until 2026-03-11 09:33:55.997469643 +0000 UTC m=+1203.778620332 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift") pod "swift-storage-0" (UID: "db4bedf3-ea20-4a63-9623-96286e9b243b") : configmap "swift-ring-files" not found Mar 11 09:33:53 crc kubenswrapper[4830]: I0311 09:33:53.997413 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp7tg\" (UniqueName: \"kubernetes.io/projected/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-kube-api-access-tp7tg\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:54 crc kubenswrapper[4830]: I0311 09:33:54.001806 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-combined-ca-bundle\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:54 crc kubenswrapper[4830]: I0311 
09:33:54.002273 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-db-sync-config-data\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:54 crc kubenswrapper[4830]: I0311 09:33:54.003311 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-config-data\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:54 crc kubenswrapper[4830]: I0311 09:33:54.021102 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp7tg\" (UniqueName: \"kubernetes.io/projected/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-kube-api-access-tp7tg\") pod \"glance-db-sync-8cx6g\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:54 crc kubenswrapper[4830]: I0311 09:33:54.074841 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8cx6g" Mar 11 09:33:54 crc kubenswrapper[4830]: I0311 09:33:54.621986 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8cx6g"] Mar 11 09:33:54 crc kubenswrapper[4830]: W0311 09:33:54.625324 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb917ac6a_dcd3_46e7_b4a5_65e7a5622959.slice/crio-3a133d0155668adf49798b06d8666e16c6e46e7ba487e407ca55ee45dbaaec97 WatchSource:0}: Error finding container 3a133d0155668adf49798b06d8666e16c6e46e7ba487e407ca55ee45dbaaec97: Status 404 returned error can't find the container with id 3a133d0155668adf49798b06d8666e16c6e46e7ba487e407ca55ee45dbaaec97 Mar 11 09:33:54 crc kubenswrapper[4830]: I0311 09:33:54.731916 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jmd8g" event={"ID":"9e52669c-56df-4791-84e7-4d4bd34e420f","Type":"ContainerStarted","Data":"125ff11228c793aa8c4ceee859b4a0837f1aad64ebda622b1689d4eef0f1c8e1"} Mar 11 09:33:54 crc kubenswrapper[4830]: I0311 09:33:54.740885 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8cx6g" event={"ID":"b917ac6a-dcd3-46e7-b4a5-65e7a5622959","Type":"ContainerStarted","Data":"3a133d0155668adf49798b06d8666e16c6e46e7ba487e407ca55ee45dbaaec97"} Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.031269 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:33:56 crc kubenswrapper[4830]: E0311 09:33:56.031476 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 09:33:56 crc kubenswrapper[4830]: E0311 09:33:56.031604 4830 projected.go:194] Error 
preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 09:33:56 crc kubenswrapper[4830]: E0311 09:33:56.031664 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift podName:db4bedf3-ea20-4a63-9623-96286e9b243b nodeName:}" failed. No retries permitted until 2026-03-11 09:34:00.031643398 +0000 UTC m=+1207.812794167 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift") pod "swift-storage-0" (UID: "db4bedf3-ea20-4a63-9623-96286e9b243b") : configmap "swift-ring-files" not found Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.449445 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-x6g8g"] Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.450689 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-x6g8g" Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.453728 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.481108 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x6g8g"] Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.643255 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-operator-scripts\") pod \"root-account-create-update-x6g8g\" (UID: \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\") " pod="openstack/root-account-create-update-x6g8g" Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.643727 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkc99\" (UniqueName: \"kubernetes.io/projected/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-kube-api-access-jkc99\") pod \"root-account-create-update-x6g8g\" (UID: \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\") " pod="openstack/root-account-create-update-x6g8g" Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.744928 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkc99\" (UniqueName: \"kubernetes.io/projected/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-kube-api-access-jkc99\") pod \"root-account-create-update-x6g8g\" (UID: \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\") " pod="openstack/root-account-create-update-x6g8g" Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.745077 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-operator-scripts\") pod \"root-account-create-update-x6g8g\" (UID: 
\"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\") " pod="openstack/root-account-create-update-x6g8g" Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.745810 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-operator-scripts\") pod \"root-account-create-update-x6g8g\" (UID: \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\") " pod="openstack/root-account-create-update-x6g8g" Mar 11 09:33:56 crc kubenswrapper[4830]: I0311 09:33:56.768513 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkc99\" (UniqueName: \"kubernetes.io/projected/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-kube-api-access-jkc99\") pod \"root-account-create-update-x6g8g\" (UID: \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\") " pod="openstack/root-account-create-update-x6g8g" Mar 11 09:33:57 crc kubenswrapper[4830]: I0311 09:33:57.067721 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-x6g8g" Mar 11 09:33:58 crc kubenswrapper[4830]: I0311 09:33:58.775658 4830 generic.go:334] "Generic (PLEG): container finished" podID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerID="d0f891f9ee5111ec4ec41fb1f690e427848198d851ed49b68aea9751db762add" exitCode=0 Mar 11 09:33:58 crc kubenswrapper[4830]: I0311 09:33:58.775778 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3","Type":"ContainerDied","Data":"d0f891f9ee5111ec4ec41fb1f690e427848198d851ed49b68aea9751db762add"} Mar 11 09:33:58 crc kubenswrapper[4830]: I0311 09:33:58.778549 4830 generic.go:334] "Generic (PLEG): container finished" podID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerID="14d9a01991262c020d733b7284b2b073581e01cc8c4c54b5086cc048dede37f5" exitCode=0 Mar 11 09:33:58 crc kubenswrapper[4830]: I0311 09:33:58.778578 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75e77e40-6cb5-47ec-9074-b663b7dba6b4","Type":"ContainerDied","Data":"14d9a01991262c020d733b7284b2b073581e01cc8c4c54b5086cc048dede37f5"} Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.028549 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x6g8g"] Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.099941 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:34:00 crc kubenswrapper[4830]: E0311 09:34:00.100171 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 09:34:00 crc kubenswrapper[4830]: E0311 09:34:00.100208 4830 projected.go:194] Error preparing data 
for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 09:34:00 crc kubenswrapper[4830]: E0311 09:34:00.100264 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift podName:db4bedf3-ea20-4a63-9623-96286e9b243b nodeName:}" failed. No retries permitted until 2026-03-11 09:34:08.100246554 +0000 UTC m=+1215.881397303 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift") pod "swift-storage-0" (UID: "db4bedf3-ea20-4a63-9623-96286e9b243b") : configmap "swift-ring-files" not found Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.139075 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553694-f9xwg"] Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.140201 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.142963 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.143215 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.143833 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.152813 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553694-f9xwg"] Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.303486 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgvnw\" (UniqueName: \"kubernetes.io/projected/74000d07-4644-41b1-90d3-0de67ed840c7-kube-api-access-fgvnw\") pod \"auto-csr-approver-29553694-f9xwg\" (UID: \"74000d07-4644-41b1-90d3-0de67ed840c7\") " pod="openshift-infra/auto-csr-approver-29553694-f9xwg" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.405738 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgvnw\" (UniqueName: \"kubernetes.io/projected/74000d07-4644-41b1-90d3-0de67ed840c7-kube-api-access-fgvnw\") pod \"auto-csr-approver-29553694-f9xwg\" (UID: \"74000d07-4644-41b1-90d3-0de67ed840c7\") " pod="openshift-infra/auto-csr-approver-29553694-f9xwg" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.425826 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgvnw\" (UniqueName: \"kubernetes.io/projected/74000d07-4644-41b1-90d3-0de67ed840c7-kube-api-access-fgvnw\") pod \"auto-csr-approver-29553694-f9xwg\" (UID: \"74000d07-4644-41b1-90d3-0de67ed840c7\") " 
pod="openshift-infra/auto-csr-approver-29553694-f9xwg" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.456844 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.802518 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jmd8g" event={"ID":"9e52669c-56df-4791-84e7-4d4bd34e420f","Type":"ContainerStarted","Data":"4adf5722b568cb5fe2e2d9a06d346dec42ec82faab58634a1b9dd676cb4b91ce"} Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.806290 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3","Type":"ContainerStarted","Data":"6f1470d751e93a88f827c3665ad0c284f3ff260cab385acdd59b329e7a850524"} Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.806599 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.808737 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75e77e40-6cb5-47ec-9074-b663b7dba6b4","Type":"ContainerStarted","Data":"3220b80cf459f53356fc59464b56078b0b7074afee6c4ffef2dc9092091aa52c"} Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.809310 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.812834 4830 generic.go:334] "Generic (PLEG): container finished" podID="bed47348-1ba4-41ad-a417-7b9c55bdb1f2" containerID="c513882a6021e53a245d11059a47ae8bee641309fcbd1737a370fccbcef71d3d" exitCode=0 Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.812869 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x6g8g" 
event={"ID":"bed47348-1ba4-41ad-a417-7b9c55bdb1f2","Type":"ContainerDied","Data":"c513882a6021e53a245d11059a47ae8bee641309fcbd1737a370fccbcef71d3d"} Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.812895 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x6g8g" event={"ID":"bed47348-1ba4-41ad-a417-7b9c55bdb1f2","Type":"ContainerStarted","Data":"40af7854b66eb5b35b3a98de0fe64f8706a524d545bbf63f06adf6b01434f098"} Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.820517 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jmd8g" podStartSLOduration=2.968849729 podStartE2EDuration="8.820302006s" podCreationTimestamp="2026-03-11 09:33:52 +0000 UTC" firstStartedPulling="2026-03-11 09:33:53.845396969 +0000 UTC m=+1201.626547658" lastFinishedPulling="2026-03-11 09:33:59.696849246 +0000 UTC m=+1207.477999935" observedRunningTime="2026-03-11 09:34:00.82007823 +0000 UTC m=+1208.601228929" watchObservedRunningTime="2026-03-11 09:34:00.820302006 +0000 UTC m=+1208.601452705" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.869511 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.60225782 podStartE2EDuration="1m5.869491509s" podCreationTimestamp="2026-03-11 09:32:55 +0000 UTC" firstStartedPulling="2026-03-11 09:32:57.92122647 +0000 UTC m=+1145.702377159" lastFinishedPulling="2026-03-11 09:33:25.188460159 +0000 UTC m=+1172.969610848" observedRunningTime="2026-03-11 09:34:00.863288057 +0000 UTC m=+1208.644438766" watchObservedRunningTime="2026-03-11 09:34:00.869491509 +0000 UTC m=+1208.650642208" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.896311 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.054401066 podStartE2EDuration="1m5.896292261s" podCreationTimestamp="2026-03-11 09:32:55 
+0000 UTC" firstStartedPulling="2026-03-11 09:32:57.369335315 +0000 UTC m=+1145.150486004" lastFinishedPulling="2026-03-11 09:33:25.21122651 +0000 UTC m=+1172.992377199" observedRunningTime="2026-03-11 09:34:00.892670181 +0000 UTC m=+1208.673820890" watchObservedRunningTime="2026-03-11 09:34:00.896292261 +0000 UTC m=+1208.677442940" Mar 11 09:34:00 crc kubenswrapper[4830]: I0311 09:34:00.935654 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553694-f9xwg"] Mar 11 09:34:01 crc kubenswrapper[4830]: I0311 09:34:01.460180 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" Mar 11 09:34:01 crc kubenswrapper[4830]: I0311 09:34:01.525701 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mkvnn"] Mar 11 09:34:01 crc kubenswrapper[4830]: I0311 09:34:01.525987 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-mkvnn" podUID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerName="dnsmasq-dns" containerID="cri-o://4d82246f8c771b8d1aa4957b6eedca363fe47c836b57dbd8c5d503d49ccb1070" gracePeriod=10 Mar 11 09:34:01 crc kubenswrapper[4830]: I0311 09:34:01.841579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" event={"ID":"74000d07-4644-41b1-90d3-0de67ed840c7","Type":"ContainerStarted","Data":"307fad9dfe0eeb3df150c5d043eba466c4924cb64d3589a3159b657ec8f0df16"} Mar 11 09:34:01 crc kubenswrapper[4830]: I0311 09:34:01.853466 4830 generic.go:334] "Generic (PLEG): container finished" podID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerID="4d82246f8c771b8d1aa4957b6eedca363fe47c836b57dbd8c5d503d49ccb1070" exitCode=0 Mar 11 09:34:01 crc kubenswrapper[4830]: I0311 09:34:01.853538 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mkvnn" 
event={"ID":"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a","Type":"ContainerDied","Data":"4d82246f8c771b8d1aa4957b6eedca363fe47c836b57dbd8c5d503d49ccb1070"} Mar 11 09:34:02 crc kubenswrapper[4830]: I0311 09:34:02.195482 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 11 09:34:06 crc kubenswrapper[4830]: I0311 09:34:06.458113 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-mkvnn" podUID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.168220 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:34:08 crc kubenswrapper[4830]: E0311 09:34:08.168389 4830 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 11 09:34:08 crc kubenswrapper[4830]: E0311 09:34:08.168414 4830 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 11 09:34:08 crc kubenswrapper[4830]: E0311 09:34:08.168475 4830 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift podName:db4bedf3-ea20-4a63-9623-96286e9b243b nodeName:}" failed. No retries permitted until 2026-03-11 09:34:24.168458904 +0000 UTC m=+1231.949609593 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift") pod "swift-storage-0" (UID: "db4bedf3-ea20-4a63-9623-96286e9b243b") : configmap "swift-ring-files" not found Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.605671 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xjsks" podUID="d97948cc-fc42-46c8-b46e-3f8efdc251db" containerName="ovn-controller" probeResult="failure" output=< Mar 11 09:34:08 crc kubenswrapper[4830]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 11 09:34:08 crc kubenswrapper[4830]: > Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.641765 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.649763 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-klr5s" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.879507 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xjsks-config-d6ccd"] Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.880514 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.884311 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.896828 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjsks-config-d6ccd"] Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.985097 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwl5\" (UniqueName: \"kubernetes.io/projected/0042d324-98e3-4464-afab-fbc5cff018dc-kube-api-access-xdwl5\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.985146 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-additional-scripts\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.985178 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run-ovn\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.985446 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-log-ovn\") pod 
\"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.985507 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-scripts\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:08 crc kubenswrapper[4830]: I0311 09:34:08.985563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087225 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwl5\" (UniqueName: \"kubernetes.io/projected/0042d324-98e3-4464-afab-fbc5cff018dc-kube-api-access-xdwl5\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-additional-scripts\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087305 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run-ovn\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087370 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-log-ovn\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087390 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-scripts\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087411 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087676 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087723 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-log-ovn\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.087741 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run-ovn\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.089796 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-scripts\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.090277 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-additional-scripts\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.110603 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwl5\" (UniqueName: \"kubernetes.io/projected/0042d324-98e3-4464-afab-fbc5cff018dc-kube-api-access-xdwl5\") pod \"ovn-controller-xjsks-config-d6ccd\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:09 crc kubenswrapper[4830]: I0311 09:34:09.199595 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.198134 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x6g8g" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.306779 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-operator-scripts\") pod \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\" (UID: \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\") " Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.308271 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkc99\" (UniqueName: \"kubernetes.io/projected/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-kube-api-access-jkc99\") pod \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\" (UID: \"bed47348-1ba4-41ad-a417-7b9c55bdb1f2\") " Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.311301 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bed47348-1ba4-41ad-a417-7b9c55bdb1f2" (UID: "bed47348-1ba4-41ad-a417-7b9c55bdb1f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.318784 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-kube-api-access-jkc99" (OuterVolumeSpecName: "kube-api-access-jkc99") pod "bed47348-1ba4-41ad-a417-7b9c55bdb1f2" (UID: "bed47348-1ba4-41ad-a417-7b9c55bdb1f2"). InnerVolumeSpecName "kube-api-access-jkc99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.412488 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.412530 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkc99\" (UniqueName: \"kubernetes.io/projected/bed47348-1ba4-41ad-a417-7b9c55bdb1f2-kube-api-access-jkc99\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.427344 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.513158 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-nb\") pod \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.513547 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-sb\") pod \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.513649 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdsvj\" (UniqueName: \"kubernetes.io/projected/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-kube-api-access-kdsvj\") pod \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.513743 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-config\") pod \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.513813 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-dns-svc\") pod \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\" (UID: \"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a\") " Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.535056 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-kube-api-access-kdsvj" (OuterVolumeSpecName: "kube-api-access-kdsvj") pod "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" (UID: "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a"). InnerVolumeSpecName "kube-api-access-kdsvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.570551 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" (UID: "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.581395 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-config" (OuterVolumeSpecName: "config") pod "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" (UID: "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.581420 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" (UID: "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.597332 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" (UID: "ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.615272 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.615298 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.615308 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdsvj\" (UniqueName: \"kubernetes.io/projected/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-kube-api-access-kdsvj\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.615318 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-config\") on node \"crc\" 
DevicePath \"\"" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.615327 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.691443 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xjsks-config-d6ccd"] Mar 11 09:34:10 crc kubenswrapper[4830]: W0311 09:34:10.695325 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0042d324_98e3_4464_afab_fbc5cff018dc.slice/crio-ca6a12a39965a7b83313a46ca499d76ebd136da6ccdfd4c66de7ac38f0a7876e WatchSource:0}: Error finding container ca6a12a39965a7b83313a46ca499d76ebd136da6ccdfd4c66de7ac38f0a7876e: Status 404 returned error can't find the container with id ca6a12a39965a7b83313a46ca499d76ebd136da6ccdfd4c66de7ac38f0a7876e Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.933955 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x6g8g" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.936025 4830 generic.go:334] "Generic (PLEG): container finished" podID="9e52669c-56df-4791-84e7-4d4bd34e420f" containerID="4adf5722b568cb5fe2e2d9a06d346dec42ec82faab58634a1b9dd676cb4b91ce" exitCode=0 Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.939871 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mkvnn" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.960321 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" podStartSLOduration=1.71475466 podStartE2EDuration="10.960299193s" podCreationTimestamp="2026-03-11 09:34:00 +0000 UTC" firstStartedPulling="2026-03-11 09:34:00.957657682 +0000 UTC m=+1208.738808371" lastFinishedPulling="2026-03-11 09:34:10.203202215 +0000 UTC m=+1217.984352904" observedRunningTime="2026-03-11 09:34:10.950304656 +0000 UTC m=+1218.731455355" watchObservedRunningTime="2026-03-11 09:34:10.960299193 +0000 UTC m=+1218.741449882" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.960343 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x6g8g" event={"ID":"bed47348-1ba4-41ad-a417-7b9c55bdb1f2","Type":"ContainerDied","Data":"40af7854b66eb5b35b3a98de0fe64f8706a524d545bbf63f06adf6b01434f098"} Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.960479 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40af7854b66eb5b35b3a98de0fe64f8706a524d545bbf63f06adf6b01434f098" Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.960491 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jmd8g" event={"ID":"9e52669c-56df-4791-84e7-4d4bd34e420f","Type":"ContainerDied","Data":"4adf5722b568cb5fe2e2d9a06d346dec42ec82faab58634a1b9dd676cb4b91ce"} Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.960507 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" event={"ID":"74000d07-4644-41b1-90d3-0de67ed840c7","Type":"ContainerStarted","Data":"b56d6f54832da7b7c2a6d9e1058daf4c7149770f50afb7cc2145d21ed8dc4dcc"} Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.960531 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-mkvnn" event={"ID":"ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a","Type":"ContainerDied","Data":"618ce78324a8af33895588acaf10d15bbc911420aae3b44fe7ee4dda42ae0afe"} Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.960545 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjsks-config-d6ccd" event={"ID":"0042d324-98e3-4464-afab-fbc5cff018dc","Type":"ContainerStarted","Data":"ca6a12a39965a7b83313a46ca499d76ebd136da6ccdfd4c66de7ac38f0a7876e"} Mar 11 09:34:10 crc kubenswrapper[4830]: I0311 09:34:10.960563 4830 scope.go:117] "RemoveContainer" containerID="4d82246f8c771b8d1aa4957b6eedca363fe47c836b57dbd8c5d503d49ccb1070" Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.015854 4830 scope.go:117] "RemoveContainer" containerID="ca30c7c2289642cbc846c4646c59f7b1e517d9de4b346c451bb4e5c0b50e8841" Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.040722 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mkvnn"] Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.048784 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mkvnn"] Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.952380 4830 generic.go:334] "Generic (PLEG): container finished" podID="74000d07-4644-41b1-90d3-0de67ed840c7" containerID="b56d6f54832da7b7c2a6d9e1058daf4c7149770f50afb7cc2145d21ed8dc4dcc" exitCode=0 Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.952450 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" event={"ID":"74000d07-4644-41b1-90d3-0de67ed840c7","Type":"ContainerDied","Data":"b56d6f54832da7b7c2a6d9e1058daf4c7149770f50afb7cc2145d21ed8dc4dcc"} Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.953954 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8cx6g" 
event={"ID":"b917ac6a-dcd3-46e7-b4a5-65e7a5622959","Type":"ContainerStarted","Data":"742a45d9ce3426284be6aff9e9444f6ecbccc966ce3e49259d63587c7653e445"} Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.956550 4830 generic.go:334] "Generic (PLEG): container finished" podID="0042d324-98e3-4464-afab-fbc5cff018dc" containerID="8a0c2f44d5c787fc5a02e301efe54cf4145e529ba5c3b55b9e617dbf9b7b8003" exitCode=0 Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.956642 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjsks-config-d6ccd" event={"ID":"0042d324-98e3-4464-afab-fbc5cff018dc","Type":"ContainerDied","Data":"8a0c2f44d5c787fc5a02e301efe54cf4145e529ba5c3b55b9e617dbf9b7b8003"} Mar 11 09:34:11 crc kubenswrapper[4830]: I0311 09:34:11.990106 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8cx6g" podStartSLOduration=3.414926698 podStartE2EDuration="18.990084647s" podCreationTimestamp="2026-03-11 09:33:53 +0000 UTC" firstStartedPulling="2026-03-11 09:33:54.628097617 +0000 UTC m=+1202.409248306" lastFinishedPulling="2026-03-11 09:34:10.203255566 +0000 UTC m=+1217.984406255" observedRunningTime="2026-03-11 09:34:11.987721802 +0000 UTC m=+1219.768872521" watchObservedRunningTime="2026-03-11 09:34:11.990084647 +0000 UTC m=+1219.771235336" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.346407 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.448978 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-ring-data-devices\") pod \"9e52669c-56df-4791-84e7-4d4bd34e420f\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.449321 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-dispersionconf\") pod \"9e52669c-56df-4791-84e7-4d4bd34e420f\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.449377 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-combined-ca-bundle\") pod \"9e52669c-56df-4791-84e7-4d4bd34e420f\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.449416 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e52669c-56df-4791-84e7-4d4bd34e420f-etc-swift\") pod \"9e52669c-56df-4791-84e7-4d4bd34e420f\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.449435 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5t95\" (UniqueName: \"kubernetes.io/projected/9e52669c-56df-4791-84e7-4d4bd34e420f-kube-api-access-f5t95\") pod \"9e52669c-56df-4791-84e7-4d4bd34e420f\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.449512 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-swiftconf\") pod \"9e52669c-56df-4791-84e7-4d4bd34e420f\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.449560 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-scripts\") pod \"9e52669c-56df-4791-84e7-4d4bd34e420f\" (UID: \"9e52669c-56df-4791-84e7-4d4bd34e420f\") " Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.450690 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9e52669c-56df-4791-84e7-4d4bd34e420f" (UID: "9e52669c-56df-4791-84e7-4d4bd34e420f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.451964 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e52669c-56df-4791-84e7-4d4bd34e420f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9e52669c-56df-4791-84e7-4d4bd34e420f" (UID: "9e52669c-56df-4791-84e7-4d4bd34e420f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.456132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e52669c-56df-4791-84e7-4d4bd34e420f-kube-api-access-f5t95" (OuterVolumeSpecName: "kube-api-access-f5t95") pod "9e52669c-56df-4791-84e7-4d4bd34e420f" (UID: "9e52669c-56df-4791-84e7-4d4bd34e420f"). InnerVolumeSpecName "kube-api-access-f5t95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.462694 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9e52669c-56df-4791-84e7-4d4bd34e420f" (UID: "9e52669c-56df-4791-84e7-4d4bd34e420f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.474246 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9e52669c-56df-4791-84e7-4d4bd34e420f" (UID: "9e52669c-56df-4791-84e7-4d4bd34e420f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.475297 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-scripts" (OuterVolumeSpecName: "scripts") pod "9e52669c-56df-4791-84e7-4d4bd34e420f" (UID: "9e52669c-56df-4791-84e7-4d4bd34e420f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.491071 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e52669c-56df-4791-84e7-4d4bd34e420f" (UID: "9e52669c-56df-4791-84e7-4d4bd34e420f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.552017 4830 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.552268 4830 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.552364 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.552427 4830 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9e52669c-56df-4791-84e7-4d4bd34e420f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.552484 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5t95\" (UniqueName: \"kubernetes.io/projected/9e52669c-56df-4791-84e7-4d4bd34e420f-kube-api-access-f5t95\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.552544 4830 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9e52669c-56df-4791-84e7-4d4bd34e420f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.552605 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e52669c-56df-4791-84e7-4d4bd34e420f-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.943305 4830 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" path="/var/lib/kubelet/pods/ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a/volumes" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.964024 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jmd8g" event={"ID":"9e52669c-56df-4791-84e7-4d4bd34e420f","Type":"ContainerDied","Data":"125ff11228c793aa8c4ceee859b4a0837f1aad64ebda622b1689d4eef0f1c8e1"} Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.964078 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="125ff11228c793aa8c4ceee859b4a0837f1aad64ebda622b1689d4eef0f1c8e1" Mar 11 09:34:12 crc kubenswrapper[4830]: I0311 09:34:12.964328 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jmd8g" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.060420 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.060476 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.060532 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.061184 4830 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e06cb787da7fb0f6798e1465b6764e43fa4f19a8709a93ec236e4a0b85a72f7c"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.061235 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://e06cb787da7fb0f6798e1465b6764e43fa4f19a8709a93ec236e4a0b85a72f7c" gracePeriod=600 Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.439313 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.446193 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.468819 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run-ovn\") pod \"0042d324-98e3-4464-afab-fbc5cff018dc\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.468876 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-scripts\") pod \"0042d324-98e3-4464-afab-fbc5cff018dc\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.468898 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-log-ovn\") pod \"0042d324-98e3-4464-afab-fbc5cff018dc\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.468984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0042d324-98e3-4464-afab-fbc5cff018dc" (UID: "0042d324-98e3-4464-afab-fbc5cff018dc"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.469035 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-additional-scripts\") pod \"0042d324-98e3-4464-afab-fbc5cff018dc\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.469052 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0042d324-98e3-4464-afab-fbc5cff018dc" (UID: "0042d324-98e3-4464-afab-fbc5cff018dc"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.469095 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdwl5\" (UniqueName: \"kubernetes.io/projected/0042d324-98e3-4464-afab-fbc5cff018dc-kube-api-access-xdwl5\") pod \"0042d324-98e3-4464-afab-fbc5cff018dc\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.469166 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run\") pod \"0042d324-98e3-4464-afab-fbc5cff018dc\" (UID: \"0042d324-98e3-4464-afab-fbc5cff018dc\") " Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.469528 4830 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.469553 4830 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.469594 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run" (OuterVolumeSpecName: "var-run") pod "0042d324-98e3-4464-afab-fbc5cff018dc" (UID: "0042d324-98e3-4464-afab-fbc5cff018dc"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.470219 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0042d324-98e3-4464-afab-fbc5cff018dc" (UID: "0042d324-98e3-4464-afab-fbc5cff018dc"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.470739 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-scripts" (OuterVolumeSpecName: "scripts") pod "0042d324-98e3-4464-afab-fbc5cff018dc" (UID: "0042d324-98e3-4464-afab-fbc5cff018dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.475200 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0042d324-98e3-4464-afab-fbc5cff018dc-kube-api-access-xdwl5" (OuterVolumeSpecName: "kube-api-access-xdwl5") pod "0042d324-98e3-4464-afab-fbc5cff018dc" (UID: "0042d324-98e3-4464-afab-fbc5cff018dc"). InnerVolumeSpecName "kube-api-access-xdwl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.571666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgvnw\" (UniqueName: \"kubernetes.io/projected/74000d07-4644-41b1-90d3-0de67ed840c7-kube-api-access-fgvnw\") pod \"74000d07-4644-41b1-90d3-0de67ed840c7\" (UID: \"74000d07-4644-41b1-90d3-0de67ed840c7\") " Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.572164 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.572182 4830 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0042d324-98e3-4464-afab-fbc5cff018dc-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.572192 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdwl5\" (UniqueName: \"kubernetes.io/projected/0042d324-98e3-4464-afab-fbc5cff018dc-kube-api-access-xdwl5\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.572201 4830 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0042d324-98e3-4464-afab-fbc5cff018dc-var-run\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.592229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74000d07-4644-41b1-90d3-0de67ed840c7-kube-api-access-fgvnw" (OuterVolumeSpecName: "kube-api-access-fgvnw") pod "74000d07-4644-41b1-90d3-0de67ed840c7" (UID: "74000d07-4644-41b1-90d3-0de67ed840c7"). InnerVolumeSpecName "kube-api-access-fgvnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.673439 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgvnw\" (UniqueName: \"kubernetes.io/projected/74000d07-4644-41b1-90d3-0de67ed840c7-kube-api-access-fgvnw\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.685631 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xjsks" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.977361 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="e06cb787da7fb0f6798e1465b6764e43fa4f19a8709a93ec236e4a0b85a72f7c" exitCode=0 Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.977400 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"e06cb787da7fb0f6798e1465b6764e43fa4f19a8709a93ec236e4a0b85a72f7c"} Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.977449 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"6d51cf8acf1c408e7829c31a89fc6bf74196f438e8b371de9aaedaab30e9cfc5"} Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.977469 4830 scope.go:117] "RemoveContainer" containerID="490e43e253d22e49dfc5c2a704ffdefb34fe709ef23f6e9173eecf22518d399e" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.982411 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xjsks-config-d6ccd" event={"ID":"0042d324-98e3-4464-afab-fbc5cff018dc","Type":"ContainerDied","Data":"ca6a12a39965a7b83313a46ca499d76ebd136da6ccdfd4c66de7ac38f0a7876e"} Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.982466 
4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca6a12a39965a7b83313a46ca499d76ebd136da6ccdfd4c66de7ac38f0a7876e" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.982430 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xjsks-config-d6ccd" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.988339 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" event={"ID":"74000d07-4644-41b1-90d3-0de67ed840c7","Type":"ContainerDied","Data":"307fad9dfe0eeb3df150c5d043eba466c4924cb64d3589a3159b657ec8f0df16"} Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.988390 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="307fad9dfe0eeb3df150c5d043eba466c4924cb64d3589a3159b657ec8f0df16" Mar 11 09:34:13 crc kubenswrapper[4830]: I0311 09:34:13.988461 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553694-f9xwg" Mar 11 09:34:14 crc kubenswrapper[4830]: I0311 09:34:14.031165 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553688-6lxbc"] Mar 11 09:34:14 crc kubenswrapper[4830]: I0311 09:34:14.044168 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553688-6lxbc"] Mar 11 09:34:14 crc kubenswrapper[4830]: I0311 09:34:14.545731 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xjsks-config-d6ccd"] Mar 11 09:34:14 crc kubenswrapper[4830]: I0311 09:34:14.553607 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xjsks-config-d6ccd"] Mar 11 09:34:14 crc kubenswrapper[4830]: I0311 09:34:14.942331 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0042d324-98e3-4464-afab-fbc5cff018dc" path="/var/lib/kubelet/pods/0042d324-98e3-4464-afab-fbc5cff018dc/volumes" Mar 11 09:34:14 crc kubenswrapper[4830]: I0311 09:34:14.943098 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f87712-c9d4-4576-b590-95f554829b4a" path="/var/lib/kubelet/pods/e1f87712-c9d4-4576-b590-95f554829b4a/volumes" Mar 11 09:34:16 crc kubenswrapper[4830]: I0311 09:34:16.878278 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:34:17 crc kubenswrapper[4830]: I0311 09:34:17.251308 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.043124 4830 generic.go:334] "Generic (PLEG): container finished" podID="b917ac6a-dcd3-46e7-b4a5-65e7a5622959" containerID="742a45d9ce3426284be6aff9e9444f6ecbccc966ce3e49259d63587c7653e445" exitCode=0 Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.043165 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-8cx6g" event={"ID":"b917ac6a-dcd3-46e7-b4a5-65e7a5622959","Type":"ContainerDied","Data":"742a45d9ce3426284be6aff9e9444f6ecbccc966ce3e49259d63587c7653e445"} Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.771710 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cx6qq"] Mar 11 09:34:18 crc kubenswrapper[4830]: E0311 09:34:18.772381 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerName="init" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772395 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerName="init" Mar 11 09:34:18 crc kubenswrapper[4830]: E0311 09:34:18.772406 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e52669c-56df-4791-84e7-4d4bd34e420f" containerName="swift-ring-rebalance" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772412 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e52669c-56df-4791-84e7-4d4bd34e420f" containerName="swift-ring-rebalance" Mar 11 09:34:18 crc kubenswrapper[4830]: E0311 09:34:18.772424 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0042d324-98e3-4464-afab-fbc5cff018dc" containerName="ovn-config" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772434 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0042d324-98e3-4464-afab-fbc5cff018dc" containerName="ovn-config" Mar 11 09:34:18 crc kubenswrapper[4830]: E0311 09:34:18.772462 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed47348-1ba4-41ad-a417-7b9c55bdb1f2" containerName="mariadb-account-create-update" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772468 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed47348-1ba4-41ad-a417-7b9c55bdb1f2" containerName="mariadb-account-create-update" Mar 11 09:34:18 crc kubenswrapper[4830]: E0311 09:34:18.772476 
4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74000d07-4644-41b1-90d3-0de67ed840c7" containerName="oc" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772482 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="74000d07-4644-41b1-90d3-0de67ed840c7" containerName="oc" Mar 11 09:34:18 crc kubenswrapper[4830]: E0311 09:34:18.772500 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerName="dnsmasq-dns" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772505 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerName="dnsmasq-dns" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772644 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5c494a-d5f6-4baa-8ed1-c8c4711bb36a" containerName="dnsmasq-dns" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772656 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e52669c-56df-4791-84e7-4d4bd34e420f" containerName="swift-ring-rebalance" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772667 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0042d324-98e3-4464-afab-fbc5cff018dc" containerName="ovn-config" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772676 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed47348-1ba4-41ad-a417-7b9c55bdb1f2" containerName="mariadb-account-create-update" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.772686 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="74000d07-4644-41b1-90d3-0de67ed840c7" containerName="oc" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.773214 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.784993 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cx6qq"] Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.864547 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-sl9ll"] Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.865846 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.872271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b4sp\" (UniqueName: \"kubernetes.io/projected/7808b223-41ad-41c0-96f6-d7434ce65017-kube-api-access-6b4sp\") pod \"cinder-db-create-cx6qq\" (UID: \"7808b223-41ad-41c0-96f6-d7434ce65017\") " pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.872342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7808b223-41ad-41c0-96f6-d7434ce65017-operator-scripts\") pod \"cinder-db-create-cx6qq\" (UID: \"7808b223-41ad-41c0-96f6-d7434ce65017\") " pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.880163 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sl9ll"] Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.905542 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-aea4-account-create-update-jvpsd"] Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.907080 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.911325 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.930925 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aea4-account-create-update-jvpsd"] Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.973942 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7808b223-41ad-41c0-96f6-d7434ce65017-operator-scripts\") pod \"cinder-db-create-cx6qq\" (UID: \"7808b223-41ad-41c0-96f6-d7434ce65017\") " pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.974089 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pn4v\" (UniqueName: \"kubernetes.io/projected/084f4a11-5159-49ca-b836-125e788f09e4-kube-api-access-6pn4v\") pod \"barbican-db-create-sl9ll\" (UID: \"084f4a11-5159-49ca-b836-125e788f09e4\") " pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.974137 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084f4a11-5159-49ca-b836-125e788f09e4-operator-scripts\") pod \"barbican-db-create-sl9ll\" (UID: \"084f4a11-5159-49ca-b836-125e788f09e4\") " pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.974212 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b4sp\" (UniqueName: \"kubernetes.io/projected/7808b223-41ad-41c0-96f6-d7434ce65017-kube-api-access-6b4sp\") pod \"cinder-db-create-cx6qq\" (UID: \"7808b223-41ad-41c0-96f6-d7434ce65017\") " 
pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.974235 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-operator-scripts\") pod \"barbican-aea4-account-create-update-jvpsd\" (UID: \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\") " pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.974258 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtkq\" (UniqueName: \"kubernetes.io/projected/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-kube-api-access-pmtkq\") pod \"barbican-aea4-account-create-update-jvpsd\" (UID: \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\") " pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.974828 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7808b223-41ad-41c0-96f6-d7434ce65017-operator-scripts\") pod \"cinder-db-create-cx6qq\" (UID: \"7808b223-41ad-41c0-96f6-d7434ce65017\") " pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:18 crc kubenswrapper[4830]: I0311 09:34:18.997939 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b4sp\" (UniqueName: \"kubernetes.io/projected/7808b223-41ad-41c0-96f6-d7434ce65017-kube-api-access-6b4sp\") pod \"cinder-db-create-cx6qq\" (UID: \"7808b223-41ad-41c0-96f6-d7434ce65017\") " pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.038805 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-tqb6c"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.040057 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.043260 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.043296 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.043454 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.047943 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q9l9c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.057115 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tqb6c"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.076549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084f4a11-5159-49ca-b836-125e788f09e4-operator-scripts\") pod \"barbican-db-create-sl9ll\" (UID: \"084f4a11-5159-49ca-b836-125e788f09e4\") " pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.076636 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-operator-scripts\") pod \"barbican-aea4-account-create-update-jvpsd\" (UID: \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\") " pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.076662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtkq\" (UniqueName: \"kubernetes.io/projected/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-kube-api-access-pmtkq\") pod 
\"barbican-aea4-account-create-update-jvpsd\" (UID: \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\") " pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.076701 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-227wj\" (UniqueName: \"kubernetes.io/projected/d8c5179b-9c10-4987-88cf-ba72ee746480-kube-api-access-227wj\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.076818 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-combined-ca-bundle\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.076883 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pn4v\" (UniqueName: \"kubernetes.io/projected/084f4a11-5159-49ca-b836-125e788f09e4-kube-api-access-6pn4v\") pod \"barbican-db-create-sl9ll\" (UID: \"084f4a11-5159-49ca-b836-125e788f09e4\") " pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.076920 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-config-data\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.077797 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084f4a11-5159-49ca-b836-125e788f09e4-operator-scripts\") 
pod \"barbican-db-create-sl9ll\" (UID: \"084f4a11-5159-49ca-b836-125e788f09e4\") " pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.078414 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-operator-scripts\") pod \"barbican-aea4-account-create-update-jvpsd\" (UID: \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\") " pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.098858 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pn4v\" (UniqueName: \"kubernetes.io/projected/084f4a11-5159-49ca-b836-125e788f09e4-kube-api-access-6pn4v\") pod \"barbican-db-create-sl9ll\" (UID: \"084f4a11-5159-49ca-b836-125e788f09e4\") " pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.099211 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.101959 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtkq\" (UniqueName: \"kubernetes.io/projected/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-kube-api-access-pmtkq\") pod \"barbican-aea4-account-create-update-jvpsd\" (UID: \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\") " pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.105980 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-923e-account-create-update-rfpxl"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.107260 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.109206 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.113694 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-923e-account-create-update-rfpxl"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.182464 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-combined-ca-bundle\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.182518 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts49t\" (UniqueName: \"kubernetes.io/projected/f7738582-0c74-4445-9939-be3ca1ffeab5-kube-api-access-ts49t\") pod \"cinder-923e-account-create-update-rfpxl\" (UID: \"f7738582-0c74-4445-9939-be3ca1ffeab5\") " pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.182577 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-config-data\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.182613 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7738582-0c74-4445-9939-be3ca1ffeab5-operator-scripts\") pod \"cinder-923e-account-create-update-rfpxl\" (UID: \"f7738582-0c74-4445-9939-be3ca1ffeab5\") " 
pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.182657 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-227wj\" (UniqueName: \"kubernetes.io/projected/d8c5179b-9c10-4987-88cf-ba72ee746480-kube-api-access-227wj\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.187078 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-combined-ca-bundle\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.187246 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.187522 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-config-data\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.202212 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-227wj\" (UniqueName: \"kubernetes.io/projected/d8c5179b-9c10-4987-88cf-ba72ee746480-kube-api-access-227wj\") pod \"keystone-db-sync-tqb6c\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") " pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.227979 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.276148 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8kcdv"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.277373 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.284520 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts49t\" (UniqueName: \"kubernetes.io/projected/f7738582-0c74-4445-9939-be3ca1ffeab5-kube-api-access-ts49t\") pod \"cinder-923e-account-create-update-rfpxl\" (UID: \"f7738582-0c74-4445-9939-be3ca1ffeab5\") " pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.284629 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7738582-0c74-4445-9939-be3ca1ffeab5-operator-scripts\") pod \"cinder-923e-account-create-update-rfpxl\" (UID: \"f7738582-0c74-4445-9939-be3ca1ffeab5\") " pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.285444 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7738582-0c74-4445-9939-be3ca1ffeab5-operator-scripts\") pod \"cinder-923e-account-create-update-rfpxl\" (UID: \"f7738582-0c74-4445-9939-be3ca1ffeab5\") " pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.285713 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-62cb-account-create-update-gdbqf"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.286696 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.290227 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.306120 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8kcdv"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.315318 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts49t\" (UniqueName: \"kubernetes.io/projected/f7738582-0c74-4445-9939-be3ca1ffeab5-kube-api-access-ts49t\") pod \"cinder-923e-account-create-update-rfpxl\" (UID: \"f7738582-0c74-4445-9939-be3ca1ffeab5\") " pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.374191 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tqb6c" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.376161 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-62cb-account-create-update-gdbqf"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.386197 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1089ecc0-fd27-494f-905f-a4cd117e66dd-operator-scripts\") pod \"neutron-db-create-8kcdv\" (UID: \"1089ecc0-fd27-494f-905f-a4cd117e66dd\") " pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.386267 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtpw2\" (UniqueName: \"kubernetes.io/projected/1089ecc0-fd27-494f-905f-a4cd117e66dd-kube-api-access-wtpw2\") pod \"neutron-db-create-8kcdv\" (UID: \"1089ecc0-fd27-494f-905f-a4cd117e66dd\") " pod="openstack/neutron-db-create-8kcdv" Mar 11 
09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.386356 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-operator-scripts\") pod \"neutron-62cb-account-create-update-gdbqf\" (UID: \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\") " pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.386428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t68qc\" (UniqueName: \"kubernetes.io/projected/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-kube-api-access-t68qc\") pod \"neutron-62cb-account-create-update-gdbqf\" (UID: \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\") " pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.488539 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1089ecc0-fd27-494f-905f-a4cd117e66dd-operator-scripts\") pod \"neutron-db-create-8kcdv\" (UID: \"1089ecc0-fd27-494f-905f-a4cd117e66dd\") " pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.488599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtpw2\" (UniqueName: \"kubernetes.io/projected/1089ecc0-fd27-494f-905f-a4cd117e66dd-kube-api-access-wtpw2\") pod \"neutron-db-create-8kcdv\" (UID: \"1089ecc0-fd27-494f-905f-a4cd117e66dd\") " pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.488676 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-operator-scripts\") pod \"neutron-62cb-account-create-update-gdbqf\" (UID: 
\"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\") " pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.488740 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t68qc\" (UniqueName: \"kubernetes.io/projected/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-kube-api-access-t68qc\") pod \"neutron-62cb-account-create-update-gdbqf\" (UID: \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\") " pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.490249 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1089ecc0-fd27-494f-905f-a4cd117e66dd-operator-scripts\") pod \"neutron-db-create-8kcdv\" (UID: \"1089ecc0-fd27-494f-905f-a4cd117e66dd\") " pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.490249 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-operator-scripts\") pod \"neutron-62cb-account-create-update-gdbqf\" (UID: \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\") " pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.507939 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t68qc\" (UniqueName: \"kubernetes.io/projected/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-kube-api-access-t68qc\") pod \"neutron-62cb-account-create-update-gdbqf\" (UID: \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\") " pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.510486 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtpw2\" (UniqueName: \"kubernetes.io/projected/1089ecc0-fd27-494f-905f-a4cd117e66dd-kube-api-access-wtpw2\") pod 
\"neutron-db-create-8kcdv\" (UID: \"1089ecc0-fd27-494f-905f-a4cd117e66dd\") " pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.546670 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.594581 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.602699 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8cx6g" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.605821 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.705765 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-config-data\") pod \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.705811 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp7tg\" (UniqueName: \"kubernetes.io/projected/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-kube-api-access-tp7tg\") pod \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.705881 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-combined-ca-bundle\") pod \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 
09:34:19.718980 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-kube-api-access-tp7tg" (OuterVolumeSpecName: "kube-api-access-tp7tg") pod "b917ac6a-dcd3-46e7-b4a5-65e7a5622959" (UID: "b917ac6a-dcd3-46e7-b4a5-65e7a5622959"). InnerVolumeSpecName "kube-api-access-tp7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.721192 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-db-sync-config-data\") pod \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\" (UID: \"b917ac6a-dcd3-46e7-b4a5-65e7a5622959\") " Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.721867 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp7tg\" (UniqueName: \"kubernetes.io/projected/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-kube-api-access-tp7tg\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.732880 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b917ac6a-dcd3-46e7-b4a5-65e7a5622959" (UID: "b917ac6a-dcd3-46e7-b4a5-65e7a5622959"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.752667 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b917ac6a-dcd3-46e7-b4a5-65e7a5622959" (UID: "b917ac6a-dcd3-46e7-b4a5-65e7a5622959"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.800186 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-config-data" (OuterVolumeSpecName: "config-data") pod "b917ac6a-dcd3-46e7-b4a5-65e7a5622959" (UID: "b917ac6a-dcd3-46e7-b4a5-65e7a5622959"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.823243 4830 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.823286 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.823296 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b917ac6a-dcd3-46e7-b4a5-65e7a5622959-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.857038 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cx6qq"] Mar 11 09:34:19 crc kubenswrapper[4830]: I0311 09:34:19.872796 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-sl9ll"] Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:19.953068 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-tqb6c"] Mar 11 09:34:20 crc kubenswrapper[4830]: W0311 09:34:19.972612 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8c5179b_9c10_4987_88cf_ba72ee746480.slice/crio-f36fe10c36dde0f6d8d9bf8b49075d3e2bd2c2e778b6c0dc15dcfe698aad54ae WatchSource:0}: Error finding container f36fe10c36dde0f6d8d9bf8b49075d3e2bd2c2e778b6c0dc15dcfe698aad54ae: Status 404 returned error can't find the container with id f36fe10c36dde0f6d8d9bf8b49075d3e2bd2c2e778b6c0dc15dcfe698aad54ae Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:19.980966 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aea4-account-create-update-jvpsd"] Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.029364 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-923e-account-create-update-rfpxl"] Mar 11 09:34:20 crc kubenswrapper[4830]: W0311 09:34:20.035627 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7738582_0c74_4445_9939_be3ca1ffeab5.slice/crio-3e2a3765f35457d0115291b7ca69c5a5d08fc84909715b6db465a59a13629df5 WatchSource:0}: Error finding container 3e2a3765f35457d0115291b7ca69c5a5d08fc84909715b6db465a59a13629df5: Status 404 returned error can't find the container with id 3e2a3765f35457d0115291b7ca69c5a5d08fc84909715b6db465a59a13629df5 Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.063776 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aea4-account-create-update-jvpsd" event={"ID":"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d","Type":"ContainerStarted","Data":"577a883b34095e558fad1828669d49bc28003057a51724e62d7a02919e08ec9c"} Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.066043 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8cx6g" event={"ID":"b917ac6a-dcd3-46e7-b4a5-65e7a5622959","Type":"ContainerDied","Data":"3a133d0155668adf49798b06d8666e16c6e46e7ba487e407ca55ee45dbaaec97"} Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 
09:34:20.066072 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8cx6g" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.066073 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a133d0155668adf49798b06d8666e16c6e46e7ba487e407ca55ee45dbaaec97" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.068043 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sl9ll" event={"ID":"084f4a11-5159-49ca-b836-125e788f09e4","Type":"ContainerStarted","Data":"cd00a0cbb73e2a33e4f6ea1cfe7c4a4c67d76952473c582f745951dccd4f729f"} Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.069090 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tqb6c" event={"ID":"d8c5179b-9c10-4987-88cf-ba72ee746480","Type":"ContainerStarted","Data":"f36fe10c36dde0f6d8d9bf8b49075d3e2bd2c2e778b6c0dc15dcfe698aad54ae"} Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.071059 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cx6qq" event={"ID":"7808b223-41ad-41c0-96f6-d7434ce65017","Type":"ContainerStarted","Data":"ff4083b75a4188553e7c958a86d49eb6e1c6a4bb59a339a4cb88d884d208ff3c"} Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.072276 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-923e-account-create-update-rfpxl" event={"ID":"f7738582-0c74-4445-9939-be3ca1ffeab5","Type":"ContainerStarted","Data":"3e2a3765f35457d0115291b7ca69c5a5d08fc84909715b6db465a59a13629df5"} Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.300488 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8kcdv"] Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.647571 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jxw5l"] Mar 11 09:34:20 crc kubenswrapper[4830]: E0311 09:34:20.647983 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b917ac6a-dcd3-46e7-b4a5-65e7a5622959" containerName="glance-db-sync" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.647998 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b917ac6a-dcd3-46e7-b4a5-65e7a5622959" containerName="glance-db-sync" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.648287 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b917ac6a-dcd3-46e7-b4a5-65e7a5622959" containerName="glance-db-sync" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.649327 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.672249 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jxw5l"] Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.775438 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5977\" (UniqueName: \"kubernetes.io/projected/aad4e154-c667-41c1-9bf1-bf53a07a15b1-kube-api-access-t5977\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.775858 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-config\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.775890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: 
\"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.775934 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.775998 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-dns-svc\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.877629 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.877712 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-dns-svc\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.877766 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5977\" (UniqueName: \"kubernetes.io/projected/aad4e154-c667-41c1-9bf1-bf53a07a15b1-kube-api-access-t5977\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " 
pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.877810 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-config\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.877836 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.878836 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.878950 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.879646 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-config\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.879722 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-dns-svc\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.898136 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5977\" (UniqueName: \"kubernetes.io/projected/aad4e154-c667-41c1-9bf1-bf53a07a15b1-kube-api-access-t5977\") pod \"dnsmasq-dns-74dc88fc-jxw5l\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:20 crc kubenswrapper[4830]: I0311 09:34:20.999609 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.051855 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-62cb-account-create-update-gdbqf"] Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.117689 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-923e-account-create-update-rfpxl" event={"ID":"f7738582-0c74-4445-9939-be3ca1ffeab5","Type":"ContainerStarted","Data":"2a1098b6416bb8e6490be29e9e9f3e41dbec751bef400305e9eb19fba004dde2"} Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.129787 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aea4-account-create-update-jvpsd" event={"ID":"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d","Type":"ContainerStarted","Data":"67fe025f08ec4f09f32650547a8fbea269b7b6a32235514c5e9df457df319df0"} Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.139395 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8kcdv" 
event={"ID":"1089ecc0-fd27-494f-905f-a4cd117e66dd","Type":"ContainerStarted","Data":"16eae6fccdfcb7fd43b09153a9a055ce2d03e9092b8ddf4129f1d48d361561e5"} Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.139437 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8kcdv" event={"ID":"1089ecc0-fd27-494f-905f-a4cd117e66dd","Type":"ContainerStarted","Data":"d125191d89403d86227fe73004dcb7bda6be4b372c2bab019581d453ded0a773"} Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.142985 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-923e-account-create-update-rfpxl" podStartSLOduration=2.142970972 podStartE2EDuration="2.142970972s" podCreationTimestamp="2026-03-11 09:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:21.137885491 +0000 UTC m=+1228.919036190" watchObservedRunningTime="2026-03-11 09:34:21.142970972 +0000 UTC m=+1228.924121661" Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.145452 4830 generic.go:334] "Generic (PLEG): container finished" podID="084f4a11-5159-49ca-b836-125e788f09e4" containerID="b475842f9f3725012fc6b65e111cdf6b68a11b01e30facf22311a5c6239167f7" exitCode=0 Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.145513 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sl9ll" event={"ID":"084f4a11-5159-49ca-b836-125e788f09e4","Type":"ContainerDied","Data":"b475842f9f3725012fc6b65e111cdf6b68a11b01e30facf22311a5c6239167f7"} Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.155496 4830 generic.go:334] "Generic (PLEG): container finished" podID="7808b223-41ad-41c0-96f6-d7434ce65017" containerID="252d8d3c729a9b19746159a6e4d6ff0f149bcd7841396e2f189f931a6e18fe2f" exitCode=0 Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.155544 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-cx6qq" event={"ID":"7808b223-41ad-41c0-96f6-d7434ce65017","Type":"ContainerDied","Data":"252d8d3c729a9b19746159a6e4d6ff0f149bcd7841396e2f189f931a6e18fe2f"} Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.162056 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-aea4-account-create-update-jvpsd" podStartSLOduration=3.162004069 podStartE2EDuration="3.162004069s" podCreationTimestamp="2026-03-11 09:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:21.153514014 +0000 UTC m=+1228.934664723" watchObservedRunningTime="2026-03-11 09:34:21.162004069 +0000 UTC m=+1228.943154758" Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.189611 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-8kcdv" podStartSLOduration=2.189590914 podStartE2EDuration="2.189590914s" podCreationTimestamp="2026-03-11 09:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:21.188057042 +0000 UTC m=+1228.969207741" watchObservedRunningTime="2026-03-11 09:34:21.189590914 +0000 UTC m=+1228.970741613" Mar 11 09:34:21 crc kubenswrapper[4830]: I0311 09:34:21.633653 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jxw5l"] Mar 11 09:34:21 crc kubenswrapper[4830]: W0311 09:34:21.640844 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaad4e154_c667_41c1_9bf1_bf53a07a15b1.slice/crio-9c8894f7c9b6918c2e123f8d8d1fb556e043723dc789271b86c9b9875e8cf7fa WatchSource:0}: Error finding container 9c8894f7c9b6918c2e123f8d8d1fb556e043723dc789271b86c9b9875e8cf7fa: Status 404 returned error can't find the container with id 
9c8894f7c9b6918c2e123f8d8d1fb556e043723dc789271b86c9b9875e8cf7fa Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.171934 4830 generic.go:334] "Generic (PLEG): container finished" podID="96185ebd-9440-48a9-b1b6-0674ab5d4bb5" containerID="4d9607bce44ec9b425d982614df1376eea12961ef4cbaf3f9b8512ab32f5133f" exitCode=0 Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.172147 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-62cb-account-create-update-gdbqf" event={"ID":"96185ebd-9440-48a9-b1b6-0674ab5d4bb5","Type":"ContainerDied","Data":"4d9607bce44ec9b425d982614df1376eea12961ef4cbaf3f9b8512ab32f5133f"} Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.172431 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-62cb-account-create-update-gdbqf" event={"ID":"96185ebd-9440-48a9-b1b6-0674ab5d4bb5","Type":"ContainerStarted","Data":"baba53437dc3be85737a6516bee83c9be3086aabb4bbb05dbe8795ea4b069d2a"} Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.175170 4830 generic.go:334] "Generic (PLEG): container finished" podID="f7738582-0c74-4445-9939-be3ca1ffeab5" containerID="2a1098b6416bb8e6490be29e9e9f3e41dbec751bef400305e9eb19fba004dde2" exitCode=0 Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.175293 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-923e-account-create-update-rfpxl" event={"ID":"f7738582-0c74-4445-9939-be3ca1ffeab5","Type":"ContainerDied","Data":"2a1098b6416bb8e6490be29e9e9f3e41dbec751bef400305e9eb19fba004dde2"} Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.177561 4830 generic.go:334] "Generic (PLEG): container finished" podID="1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d" containerID="67fe025f08ec4f09f32650547a8fbea269b7b6a32235514c5e9df457df319df0" exitCode=0 Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.177617 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aea4-account-create-update-jvpsd" 
event={"ID":"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d","Type":"ContainerDied","Data":"67fe025f08ec4f09f32650547a8fbea269b7b6a32235514c5e9df457df319df0"} Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.180175 4830 generic.go:334] "Generic (PLEG): container finished" podID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerID="50208acc0f85fbd28984ce3f29c0c7ccc77a2c0ea461d3f727357bff380188aa" exitCode=0 Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.180258 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" event={"ID":"aad4e154-c667-41c1-9bf1-bf53a07a15b1","Type":"ContainerDied","Data":"50208acc0f85fbd28984ce3f29c0c7ccc77a2c0ea461d3f727357bff380188aa"} Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.180306 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" event={"ID":"aad4e154-c667-41c1-9bf1-bf53a07a15b1","Type":"ContainerStarted","Data":"9c8894f7c9b6918c2e123f8d8d1fb556e043723dc789271b86c9b9875e8cf7fa"} Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.188137 4830 generic.go:334] "Generic (PLEG): container finished" podID="1089ecc0-fd27-494f-905f-a4cd117e66dd" containerID="16eae6fccdfcb7fd43b09153a9a055ce2d03e9092b8ddf4129f1d48d361561e5" exitCode=0 Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.188641 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8kcdv" event={"ID":"1089ecc0-fd27-494f-905f-a4cd117e66dd","Type":"ContainerDied","Data":"16eae6fccdfcb7fd43b09153a9a055ce2d03e9092b8ddf4129f1d48d361561e5"} Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.591720 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.615412 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.719919 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084f4a11-5159-49ca-b836-125e788f09e4-operator-scripts\") pod \"084f4a11-5159-49ca-b836-125e788f09e4\" (UID: \"084f4a11-5159-49ca-b836-125e788f09e4\") " Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.719994 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7808b223-41ad-41c0-96f6-d7434ce65017-operator-scripts\") pod \"7808b223-41ad-41c0-96f6-d7434ce65017\" (UID: \"7808b223-41ad-41c0-96f6-d7434ce65017\") " Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.720042 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b4sp\" (UniqueName: \"kubernetes.io/projected/7808b223-41ad-41c0-96f6-d7434ce65017-kube-api-access-6b4sp\") pod \"7808b223-41ad-41c0-96f6-d7434ce65017\" (UID: \"7808b223-41ad-41c0-96f6-d7434ce65017\") " Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.720088 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pn4v\" (UniqueName: \"kubernetes.io/projected/084f4a11-5159-49ca-b836-125e788f09e4-kube-api-access-6pn4v\") pod \"084f4a11-5159-49ca-b836-125e788f09e4\" (UID: \"084f4a11-5159-49ca-b836-125e788f09e4\") " Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.720877 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/084f4a11-5159-49ca-b836-125e788f09e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "084f4a11-5159-49ca-b836-125e788f09e4" (UID: "084f4a11-5159-49ca-b836-125e788f09e4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.721111 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7808b223-41ad-41c0-96f6-d7434ce65017-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7808b223-41ad-41c0-96f6-d7434ce65017" (UID: "7808b223-41ad-41c0-96f6-d7434ce65017"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.724997 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7808b223-41ad-41c0-96f6-d7434ce65017-kube-api-access-6b4sp" (OuterVolumeSpecName: "kube-api-access-6b4sp") pod "7808b223-41ad-41c0-96f6-d7434ce65017" (UID: "7808b223-41ad-41c0-96f6-d7434ce65017"). InnerVolumeSpecName "kube-api-access-6b4sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.725118 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084f4a11-5159-49ca-b836-125e788f09e4-kube-api-access-6pn4v" (OuterVolumeSpecName: "kube-api-access-6pn4v") pod "084f4a11-5159-49ca-b836-125e788f09e4" (UID: "084f4a11-5159-49ca-b836-125e788f09e4"). InnerVolumeSpecName "kube-api-access-6pn4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.821839 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/084f4a11-5159-49ca-b836-125e788f09e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.821875 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7808b223-41ad-41c0-96f6-d7434ce65017-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.821884 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b4sp\" (UniqueName: \"kubernetes.io/projected/7808b223-41ad-41c0-96f6-d7434ce65017-kube-api-access-6b4sp\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:22 crc kubenswrapper[4830]: I0311 09:34:22.821895 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pn4v\" (UniqueName: \"kubernetes.io/projected/084f4a11-5159-49ca-b836-125e788f09e4-kube-api-access-6pn4v\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:23 crc kubenswrapper[4830]: I0311 09:34:23.204449 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" event={"ID":"aad4e154-c667-41c1-9bf1-bf53a07a15b1","Type":"ContainerStarted","Data":"165f49ac949ccec8e4ec006d3a7832a727b81f47d10d2c1d8969e1b392ea9bd6"} Mar 11 09:34:23 crc kubenswrapper[4830]: I0311 09:34:23.207563 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-sl9ll" event={"ID":"084f4a11-5159-49ca-b836-125e788f09e4","Type":"ContainerDied","Data":"cd00a0cbb73e2a33e4f6ea1cfe7c4a4c67d76952473c582f745951dccd4f729f"} Mar 11 09:34:23 crc kubenswrapper[4830]: I0311 09:34:23.207620 4830 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="cd00a0cbb73e2a33e4f6ea1cfe7c4a4c67d76952473c582f745951dccd4f729f" Mar 11 09:34:23 crc kubenswrapper[4830]: I0311 09:34:23.207594 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-sl9ll" Mar 11 09:34:23 crc kubenswrapper[4830]: I0311 09:34:23.213528 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cx6qq" Mar 11 09:34:23 crc kubenswrapper[4830]: I0311 09:34:23.214216 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cx6qq" event={"ID":"7808b223-41ad-41c0-96f6-d7434ce65017","Type":"ContainerDied","Data":"ff4083b75a4188553e7c958a86d49eb6e1c6a4bb59a339a4cb88d884d208ff3c"} Mar 11 09:34:23 crc kubenswrapper[4830]: I0311 09:34:23.214251 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff4083b75a4188553e7c958a86d49eb6e1c6a4bb59a339a4cb88d884d208ff3c" Mar 11 09:34:23 crc kubenswrapper[4830]: I0311 09:34:23.234988 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" podStartSLOduration=3.234970207 podStartE2EDuration="3.234970207s" podCreationTimestamp="2026-03-11 09:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:23.228312003 +0000 UTC m=+1231.009462702" watchObservedRunningTime="2026-03-11 09:34:23.234970207 +0000 UTC m=+1231.016120896" Mar 11 09:34:24 crc kubenswrapper[4830]: I0311 09:34:24.228514 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:34:24 crc kubenswrapper[4830]: I0311 09:34:24.250135 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: 
\"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:34:24 crc kubenswrapper[4830]: I0311 09:34:24.269922 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/db4bedf3-ea20-4a63-9623-96286e9b243b-etc-swift\") pod \"swift-storage-0\" (UID: \"db4bedf3-ea20-4a63-9623-96286e9b243b\") " pod="openstack/swift-storage-0" Mar 11 09:34:24 crc kubenswrapper[4830]: I0311 09:34:24.335292 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.197053 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.216698 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.279224 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.285701 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-62cb-account-create-update-gdbqf" event={"ID":"96185ebd-9440-48a9-b1b6-0674ab5d4bb5","Type":"ContainerDied","Data":"baba53437dc3be85737a6516bee83c9be3086aabb4bbb05dbe8795ea4b069d2a"} Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.285751 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baba53437dc3be85737a6516bee83c9be3086aabb4bbb05dbe8795ea4b069d2a" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.285822 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-62cb-account-create-update-gdbqf" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.295602 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-operator-scripts\") pod \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\" (UID: \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\") " Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.295655 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1089ecc0-fd27-494f-905f-a4cd117e66dd-operator-scripts\") pod \"1089ecc0-fd27-494f-905f-a4cd117e66dd\" (UID: \"1089ecc0-fd27-494f-905f-a4cd117e66dd\") " Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.295755 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t68qc\" (UniqueName: \"kubernetes.io/projected/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-kube-api-access-t68qc\") pod \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\" (UID: \"96185ebd-9440-48a9-b1b6-0674ab5d4bb5\") " Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.295847 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtpw2\" (UniqueName: \"kubernetes.io/projected/1089ecc0-fd27-494f-905f-a4cd117e66dd-kube-api-access-wtpw2\") pod \"1089ecc0-fd27-494f-905f-a4cd117e66dd\" (UID: \"1089ecc0-fd27-494f-905f-a4cd117e66dd\") " Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.296823 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1089ecc0-fd27-494f-905f-a4cd117e66dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1089ecc0-fd27-494f-905f-a4cd117e66dd" (UID: "1089ecc0-fd27-494f-905f-a4cd117e66dd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.296863 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96185ebd-9440-48a9-b1b6-0674ab5d4bb5" (UID: "96185ebd-9440-48a9-b1b6-0674ab5d4bb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.301664 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-923e-account-create-update-rfpxl" event={"ID":"f7738582-0c74-4445-9939-be3ca1ffeab5","Type":"ContainerDied","Data":"3e2a3765f35457d0115291b7ca69c5a5d08fc84909715b6db465a59a13629df5"} Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.301721 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e2a3765f35457d0115291b7ca69c5a5d08fc84909715b6db465a59a13629df5" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.301852 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-923e-account-create-update-rfpxl" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.312095 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-kube-api-access-t68qc" (OuterVolumeSpecName: "kube-api-access-t68qc") pod "96185ebd-9440-48a9-b1b6-0674ab5d4bb5" (UID: "96185ebd-9440-48a9-b1b6-0674ab5d4bb5"). InnerVolumeSpecName "kube-api-access-t68qc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.330234 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1089ecc0-fd27-494f-905f-a4cd117e66dd-kube-api-access-wtpw2" (OuterVolumeSpecName: "kube-api-access-wtpw2") pod "1089ecc0-fd27-494f-905f-a4cd117e66dd" (UID: "1089ecc0-fd27-494f-905f-a4cd117e66dd"). InnerVolumeSpecName "kube-api-access-wtpw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.336401 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aea4-account-create-update-jvpsd" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.343050 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aea4-account-create-update-jvpsd" event={"ID":"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d","Type":"ContainerDied","Data":"577a883b34095e558fad1828669d49bc28003057a51724e62d7a02919e08ec9c"} Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.343090 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="577a883b34095e558fad1828669d49bc28003057a51724e62d7a02919e08ec9c" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.354317 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8kcdv" event={"ID":"1089ecc0-fd27-494f-905f-a4cd117e66dd","Type":"ContainerDied","Data":"d125191d89403d86227fe73004dcb7bda6be4b372c2bab019581d453ded0a773"} Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.354362 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d125191d89403d86227fe73004dcb7bda6be4b372c2bab019581d453ded0a773" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.354424 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8kcdv" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.397789 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts49t\" (UniqueName: \"kubernetes.io/projected/f7738582-0c74-4445-9939-be3ca1ffeab5-kube-api-access-ts49t\") pod \"f7738582-0c74-4445-9939-be3ca1ffeab5\" (UID: \"f7738582-0c74-4445-9939-be3ca1ffeab5\") " Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.398071 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7738582-0c74-4445-9939-be3ca1ffeab5-operator-scripts\") pod \"f7738582-0c74-4445-9939-be3ca1ffeab5\" (UID: \"f7738582-0c74-4445-9939-be3ca1ffeab5\") " Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.398228 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmtkq\" (UniqueName: \"kubernetes.io/projected/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-kube-api-access-pmtkq\") pod \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\" (UID: \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\") " Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.398344 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-operator-scripts\") pod \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\" (UID: \"1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d\") " Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.398571 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7738582-0c74-4445-9939-be3ca1ffeab5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7738582-0c74-4445-9939-be3ca1ffeab5" (UID: "f7738582-0c74-4445-9939-be3ca1ffeab5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.398778 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t68qc\" (UniqueName: \"kubernetes.io/projected/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-kube-api-access-t68qc\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.398835 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtpw2\" (UniqueName: \"kubernetes.io/projected/1089ecc0-fd27-494f-905f-a4cd117e66dd-kube-api-access-wtpw2\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.399047 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96185ebd-9440-48a9-b1b6-0674ab5d4bb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.399121 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7738582-0c74-4445-9939-be3ca1ffeab5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.399184 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1089ecc0-fd27-494f-905f-a4cd117e66dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.399051 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d" (UID: "1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.415635 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7738582-0c74-4445-9939-be3ca1ffeab5-kube-api-access-ts49t" (OuterVolumeSpecName: "kube-api-access-ts49t") pod "f7738582-0c74-4445-9939-be3ca1ffeab5" (UID: "f7738582-0c74-4445-9939-be3ca1ffeab5"). InnerVolumeSpecName "kube-api-access-ts49t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.430200 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-kube-api-access-pmtkq" (OuterVolumeSpecName: "kube-api-access-pmtkq") pod "1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d" (UID: "1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d"). InnerVolumeSpecName "kube-api-access-pmtkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.497885 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.501004 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts49t\" (UniqueName: \"kubernetes.io/projected/f7738582-0c74-4445-9939-be3ca1ffeab5-kube-api-access-ts49t\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.501055 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmtkq\" (UniqueName: \"kubernetes.io/projected/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-kube-api-access-pmtkq\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:26 crc kubenswrapper[4830]: I0311 09:34:26.501065 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:26 crc 
kubenswrapper[4830]: W0311 09:34:26.505932 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb4bedf3_ea20_4a63_9623_96286e9b243b.slice/crio-a7a003abd2c8fde13dbe5f2bf0653d8707afa93013dae2b5025901557429e8c1 WatchSource:0}: Error finding container a7a003abd2c8fde13dbe5f2bf0653d8707afa93013dae2b5025901557429e8c1: Status 404 returned error can't find the container with id a7a003abd2c8fde13dbe5f2bf0653d8707afa93013dae2b5025901557429e8c1
Mar 11 09:34:27 crc kubenswrapper[4830]: I0311 09:34:27.366237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"a7a003abd2c8fde13dbe5f2bf0653d8707afa93013dae2b5025901557429e8c1"}
Mar 11 09:34:27 crc kubenswrapper[4830]: I0311 09:34:27.368442 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aea4-account-create-update-jvpsd"
Mar 11 09:34:27 crc kubenswrapper[4830]: I0311 09:34:27.370117 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tqb6c" event={"ID":"d8c5179b-9c10-4987-88cf-ba72ee746480","Type":"ContainerStarted","Data":"c2bbbf9d018f0d46fe420e6e794667233782c051f9b4092216561b0d91f7fa0b"}
Mar 11 09:34:27 crc kubenswrapper[4830]: I0311 09:34:27.396884 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-tqb6c" podStartSLOduration=2.369328284 podStartE2EDuration="8.396861963s" podCreationTimestamp="2026-03-11 09:34:19 +0000 UTC" firstStartedPulling="2026-03-11 09:34:19.978141977 +0000 UTC m=+1227.759292666" lastFinishedPulling="2026-03-11 09:34:26.005675656 +0000 UTC m=+1233.786826345" observedRunningTime="2026-03-11 09:34:27.39242503 +0000 UTC m=+1235.173575729" watchObservedRunningTime="2026-03-11 09:34:27.396861963 +0000 UTC m=+1235.178012672"
Mar 11 09:34:29 crc kubenswrapper[4830]: I0311 09:34:29.400294 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"25973ba6ec65244d6351e8c91f61b3b3678d567cbafa2284d6df4ec98f6b865a"}
Mar 11 09:34:29 crc kubenswrapper[4830]: I0311 09:34:29.400940 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"ec865f183362dd2409e4cab3b828a1082bbe6d77e53b0300e55d8d97decf7c2d"}
Mar 11 09:34:29 crc kubenswrapper[4830]: I0311 09:34:29.400958 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"f65754035816cd661f4e7fa2f1a857a75d5214b583516d0b9c2ce20bf964c002"}
Mar 11 09:34:29 crc kubenswrapper[4830]: I0311 09:34:29.400995 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"80b8ef82ad555251329e3962cd6c5954c9235fe5ace4e916b887292a81296b69"}
Mar 11 09:34:30 crc kubenswrapper[4830]: I0311 09:34:30.411988 4830 generic.go:334] "Generic (PLEG): container finished" podID="d8c5179b-9c10-4987-88cf-ba72ee746480" containerID="c2bbbf9d018f0d46fe420e6e794667233782c051f9b4092216561b0d91f7fa0b" exitCode=0
Mar 11 09:34:30 crc kubenswrapper[4830]: I0311 09:34:30.412063 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tqb6c" event={"ID":"d8c5179b-9c10-4987-88cf-ba72ee746480","Type":"ContainerDied","Data":"c2bbbf9d018f0d46fe420e6e794667233782c051f9b4092216561b0d91f7fa0b"}
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.001212 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l"
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.055219 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-4hkw2"]
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.055667 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" podUID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerName="dnsmasq-dns" containerID="cri-o://a6d584156d01702725959c856988ee13f058d3221156dd9a2cc836c7abf66ccf" gracePeriod=10
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.429248 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"182b32bf518e3faab6c48c761908b566ed509825aaaa30276c2a0a57b12195db"}
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.429609 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"b866b16cca9f65bc8d04c61b5a9bb1a6023749a4f9ca8620db40d57c5c49203a"}
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.429648 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"4fcaad067fd88a0fb01bb2e866c1694f2703553688ef508ab82997fc33028d9a"}
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.429658 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"6e252ea904a0760a536f3561f5db6363af3ceaf542fae1c6efdf9a99ef77da76"}
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.432487 4830 generic.go:334] "Generic (PLEG): container finished" podID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerID="a6d584156d01702725959c856988ee13f058d3221156dd9a2cc836c7abf66ccf" exitCode=0
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.433130 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" event={"ID":"6d65cbea-4a39-478f-91b1-60e6dd72b135","Type":"ContainerDied","Data":"a6d584156d01702725959c856988ee13f058d3221156dd9a2cc836c7abf66ccf"}
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.554737 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.706604 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-sb\") pod \"6d65cbea-4a39-478f-91b1-60e6dd72b135\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") "
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.706669 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-config\") pod \"6d65cbea-4a39-478f-91b1-60e6dd72b135\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") "
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.706705 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-nb\") pod \"6d65cbea-4a39-478f-91b1-60e6dd72b135\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") "
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.706802 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-dns-svc\") pod \"6d65cbea-4a39-478f-91b1-60e6dd72b135\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") "
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.706844 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6slm\" (UniqueName: \"kubernetes.io/projected/6d65cbea-4a39-478f-91b1-60e6dd72b135-kube-api-access-f6slm\") pod \"6d65cbea-4a39-478f-91b1-60e6dd72b135\" (UID: \"6d65cbea-4a39-478f-91b1-60e6dd72b135\") "
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.770092 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d65cbea-4a39-478f-91b1-60e6dd72b135-kube-api-access-f6slm" (OuterVolumeSpecName: "kube-api-access-f6slm") pod "6d65cbea-4a39-478f-91b1-60e6dd72b135" (UID: "6d65cbea-4a39-478f-91b1-60e6dd72b135"). InnerVolumeSpecName "kube-api-access-f6slm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.809775 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6slm\" (UniqueName: \"kubernetes.io/projected/6d65cbea-4a39-478f-91b1-60e6dd72b135-kube-api-access-f6slm\") on node \"crc\" DevicePath \"\""
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.845152 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tqb6c"
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.918303 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-227wj\" (UniqueName: \"kubernetes.io/projected/d8c5179b-9c10-4987-88cf-ba72ee746480-kube-api-access-227wj\") pod \"d8c5179b-9c10-4987-88cf-ba72ee746480\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") "
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.919365 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-combined-ca-bundle\") pod \"d8c5179b-9c10-4987-88cf-ba72ee746480\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") "
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.919424 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-config-data\") pod \"d8c5179b-9c10-4987-88cf-ba72ee746480\" (UID: \"d8c5179b-9c10-4987-88cf-ba72ee746480\") "
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.922914 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c5179b-9c10-4987-88cf-ba72ee746480-kube-api-access-227wj" (OuterVolumeSpecName: "kube-api-access-227wj") pod "d8c5179b-9c10-4987-88cf-ba72ee746480" (UID: "d8c5179b-9c10-4987-88cf-ba72ee746480"). InnerVolumeSpecName "kube-api-access-227wj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.940524 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-config" (OuterVolumeSpecName: "config") pod "6d65cbea-4a39-478f-91b1-60e6dd72b135" (UID: "6d65cbea-4a39-478f-91b1-60e6dd72b135"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.940700 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8c5179b-9c10-4987-88cf-ba72ee746480" (UID: "d8c5179b-9c10-4987-88cf-ba72ee746480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.960950 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-config-data" (OuterVolumeSpecName: "config-data") pod "d8c5179b-9c10-4987-88cf-ba72ee746480" (UID: "d8c5179b-9c10-4987-88cf-ba72ee746480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.962955 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d65cbea-4a39-478f-91b1-60e6dd72b135" (UID: "6d65cbea-4a39-478f-91b1-60e6dd72b135"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:34:31 crc kubenswrapper[4830]: I0311 09:34:31.994575 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d65cbea-4a39-478f-91b1-60e6dd72b135" (UID: "6d65cbea-4a39-478f-91b1-60e6dd72b135"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.008183 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d65cbea-4a39-478f-91b1-60e6dd72b135" (UID: "6d65cbea-4a39-478f-91b1-60e6dd72b135"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.021700 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.021821 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-config\") on node \"crc\" DevicePath \"\""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.021898 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.021968 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-227wj\" (UniqueName: \"kubernetes.io/projected/d8c5179b-9c10-4987-88cf-ba72ee746480-kube-api-access-227wj\") on node \"crc\" DevicePath \"\""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.022115 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.022187 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c5179b-9c10-4987-88cf-ba72ee746480-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.022260 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d65cbea-4a39-478f-91b1-60e6dd72b135-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.441551 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" event={"ID":"6d65cbea-4a39-478f-91b1-60e6dd72b135","Type":"ContainerDied","Data":"631900113de4d14ee4ac239ca5e7198b3bab88e11a11bd68539d527c81e0cebf"}
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.441970 4830 scope.go:117] "RemoveContainer" containerID="a6d584156d01702725959c856988ee13f058d3221156dd9a2cc836c7abf66ccf"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.441865 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.445522 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-tqb6c" event={"ID":"d8c5179b-9c10-4987-88cf-ba72ee746480","Type":"ContainerDied","Data":"f36fe10c36dde0f6d8d9bf8b49075d3e2bd2c2e778b6c0dc15dcfe698aad54ae"}
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.445567 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f36fe10c36dde0f6d8d9bf8b49075d3e2bd2c2e778b6c0dc15dcfe698aad54ae"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.445605 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-tqb6c"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.488628 4830 scope.go:117] "RemoveContainer" containerID="ba4e9de0692fbd8a3d2d164ab4dde81fd85ae4991a0736e46a21f75f11d3102f"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.498804 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-4hkw2"]
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.505148 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-4hkw2"]
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.641741 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-r8pdd"]
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642374 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1089ecc0-fd27-494f-905f-a4cd117e66dd" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642402 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1089ecc0-fd27-494f-905f-a4cd117e66dd" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642420 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7738582-0c74-4445-9939-be3ca1ffeab5" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642428 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7738582-0c74-4445-9939-be3ca1ffeab5" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642439 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerName="init"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642445 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerName="init"
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642454 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642460 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642473 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084f4a11-5159-49ca-b836-125e788f09e4" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642481 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="084f4a11-5159-49ca-b836-125e788f09e4" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642489 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7808b223-41ad-41c0-96f6-d7434ce65017" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642496 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7808b223-41ad-41c0-96f6-d7434ce65017" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642506 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96185ebd-9440-48a9-b1b6-0674ab5d4bb5" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642513 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="96185ebd-9440-48a9-b1b6-0674ab5d4bb5" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642527 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c5179b-9c10-4987-88cf-ba72ee746480" containerName="keystone-db-sync"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642533 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c5179b-9c10-4987-88cf-ba72ee746480" containerName="keystone-db-sync"
Mar 11 09:34:32 crc kubenswrapper[4830]: E0311 09:34:32.642553 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerName="dnsmasq-dns"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642559 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerName="dnsmasq-dns"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642720 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="96185ebd-9440-48a9-b1b6-0674ab5d4bb5" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642737 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1089ecc0-fd27-494f-905f-a4cd117e66dd" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642749 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7738582-0c74-4445-9939-be3ca1ffeab5" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642761 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="084f4a11-5159-49ca-b836-125e788f09e4" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642773 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7808b223-41ad-41c0-96f6-d7434ce65017" containerName="mariadb-database-create"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642785 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c5179b-9c10-4987-88cf-ba72ee746480" containerName="keystone-db-sync"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642797 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d" containerName="mariadb-account-create-update"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.642815 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerName="dnsmasq-dns"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.644150 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.693101 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-r8pdd"]
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.700192 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-p2qw2"]
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.716386 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.722704 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.723182 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q9l9c"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.723515 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.723623 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.723783 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.726708 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p2qw2"]
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.734501 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-dns-svc\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.734546 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.734575 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xzm\" (UniqueName: \"kubernetes.io/projected/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-kube-api-access-g8xzm\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.734591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.734630 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-config\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835611 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-combined-ca-bundle\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835680 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-config\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835709 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-scripts\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835740 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-config-data\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835758 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-fernet-keys\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835809 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-dns-svc\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835829 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-credential-keys\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835849 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835865 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l468\" (UniqueName: \"kubernetes.io/projected/4a2c321a-3e97-44a9-b624-f87d2df01b9c-kube-api-access-9l468\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835886 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xzm\" (UniqueName: \"kubernetes.io/projected/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-kube-api-access-g8xzm\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.835905 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.837285 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.837928 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.838713 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-config\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.840635 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-dns-svc\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.864335 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xzm\" (UniqueName: \"kubernetes.io/projected/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-kube-api-access-g8xzm\") pod \"dnsmasq-dns-7d5679f497-r8pdd\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.940087 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-fernet-keys\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.940186 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-credential-keys\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.940206 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l468\" (UniqueName: \"kubernetes.io/projected/4a2c321a-3e97-44a9-b624-f87d2df01b9c-kube-api-access-9l468\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.940243 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-combined-ca-bundle\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.940283 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-scripts\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.940310 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-config-data\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.951361 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-combined-ca-bundle\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.957925 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-config-data\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.963054 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-fernet-keys\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.963280 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5679f497-r8pdd"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.968563 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-credential-keys\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.969196 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d65cbea-4a39-478f-91b1-60e6dd72b135" path="/var/lib/kubelet/pods/6d65cbea-4a39-478f-91b1-60e6dd72b135/volumes"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.971185 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6bdc7dd545-m2jlq"]
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.972544 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bdc7dd545-m2jlq"
Mar 11 09:34:32 crc kubenswrapper[4830]: I0311 09:34:32.986571 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-scripts\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.018926 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.019141 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.019311 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.019418 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-frqhw"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.046971 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1e60a5c-559c-4667-adf4-14e8e7066569-logs\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.047053 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-scripts\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.047147 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cflv6\" (UniqueName: \"kubernetes.io/projected/c1e60a5c-559c-4667-adf4-14e8e7066569-kube-api-access-cflv6\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.047216 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-config-data\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.047291 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c1e60a5c-559c-4667-adf4-14e8e7066569-horizon-secret-key\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.058102 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bdc7dd545-m2jlq"]
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.066665 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l468\" (UniqueName: \"kubernetes.io/projected/4a2c321a-3e97-44a9-b624-f87d2df01b9c-kube-api-access-9l468\") pod \"keystone-bootstrap-p2qw2\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " pod="openstack/keystone-bootstrap-p2qw2"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.134010 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-lqwp9"]
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.135409 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lqwp9"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.151888 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-config-data\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.151960 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c1e60a5c-559c-4667-adf4-14e8e7066569-horizon-secret-key\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq"
Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.152017 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1e60a5c-559c-4667-adf4-14e8e7066569-logs\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") "
pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.152104 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-scripts\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.152139 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cflv6\" (UniqueName: \"kubernetes.io/projected/c1e60a5c-559c-4667-adf4-14e8e7066569-kube-api-access-cflv6\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.157210 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1e60a5c-559c-4667-adf4-14e8e7066569-logs\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.157976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-scripts\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.158509 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-config-data\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.176489 4830 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"cinder-cinder-dockercfg-mpg29" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.176651 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-pl7d8"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.177718 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.186403 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.186648 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.186768 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.187331 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lwgss" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.188155 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c1e60a5c-559c-4667-adf4-14e8e7066569-horizon-secret-key\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.211076 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fwspb"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.212352 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.239826 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.240009 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.240171 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sxs8x" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.255701 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-etc-machine-id\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.255756 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-combined-ca-bundle\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.255777 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-config-data\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.255830 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-combined-ca-bundle\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.255920 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hl6t\" (UniqueName: \"kubernetes.io/projected/045501ed-58bb-4a38-9b4a-5091217cf610-kube-api-access-2hl6t\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.255957 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-db-sync-config-data\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.255983 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-db-sync-config-data\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.256077 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bvd7\" (UniqueName: \"kubernetes.io/projected/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-kube-api-access-5bvd7\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.256106 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-scripts\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.266632 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cflv6\" (UniqueName: \"kubernetes.io/projected/c1e60a5c-559c-4667-adf4-14e8e7066569-kube-api-access-cflv6\") pod \"horizon-6bdc7dd545-m2jlq\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.269125 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lqwp9"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.297329 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pl7d8"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.320525 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fwspb"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.346193 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p2qw2" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.351657 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.353919 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.357327 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hl6t\" (UniqueName: \"kubernetes.io/projected/045501ed-58bb-4a38-9b4a-5091217cf610-kube-api-access-2hl6t\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.357380 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-db-sync-config-data\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.357407 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-config\") pod \"neutron-db-sync-fwspb\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.357425 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-db-sync-config-data\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.357470 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bvd7\" (UniqueName: \"kubernetes.io/projected/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-kube-api-access-5bvd7\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc 
kubenswrapper[4830]: I0311 09:34:33.357521 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-scripts\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.357555 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-combined-ca-bundle\") pod \"neutron-db-sync-fwspb\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.357586 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-etc-machine-id\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.357616 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncrfb\" (UniqueName: \"kubernetes.io/projected/253897f0-4649-46c8-9bb3-9d25a4864701-kube-api-access-ncrfb\") pod \"neutron-db-sync-fwspb\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.358354 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-etc-machine-id\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.362808 4830 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"ceilometer-config-data" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.362979 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.363079 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-combined-ca-bundle\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.363150 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-config-data\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.363217 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-combined-ca-bundle\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.387763 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-config-data\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.388669 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-combined-ca-bundle\") pod \"cinder-db-sync-lqwp9\" (UID: 
\"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.391472 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.401540 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-combined-ca-bundle\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.405085 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bvd7\" (UniqueName: \"kubernetes.io/projected/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-kube-api-access-5bvd7\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.406165 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-scripts\") pod \"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.410996 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-db-sync-config-data\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.411337 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-db-sync-config-data\") pod 
\"cinder-db-sync-lqwp9\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.444513 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hl6t\" (UniqueName: \"kubernetes.io/projected/045501ed-58bb-4a38-9b4a-5091217cf610-kube-api-access-2hl6t\") pod \"barbican-db-sync-pl7d8\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.461539 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.471977 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncrfb\" (UniqueName: \"kubernetes.io/projected/253897f0-4649-46c8-9bb3-9d25a4864701-kube-api-access-ncrfb\") pod \"neutron-db-sync-fwspb\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.472289 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-log-httpd\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.472417 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-scripts\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.472861 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-run-httpd\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.473007 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9r6v\" (UniqueName: \"kubernetes.io/projected/bc3adf05-3cb9-4fda-be48-67b6b3084179-kube-api-access-b9r6v\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.473179 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.473272 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-config\") pod \"neutron-db-sync-fwspb\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.473373 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-config-data\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.473575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-combined-ca-bundle\") pod \"neutron-db-sync-fwspb\" (UID: 
\"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.474038 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.493232 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.506619 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-config\") pod \"neutron-db-sync-fwspb\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.538305 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncrfb\" (UniqueName: \"kubernetes.io/projected/253897f0-4649-46c8-9bb3-9d25a4864701-kube-api-access-ncrfb\") pod \"neutron-db-sync-fwspb\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.594727 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2wcrf"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.603507 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.606651 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-combined-ca-bundle\") pod \"neutron-db-sync-fwspb\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.607044 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.618595 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fwspb" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.619529 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-config-data\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.619811 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.622048 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-log-httpd\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.622171 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-scripts\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.622209 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-run-httpd\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.622350 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9r6v\" (UniqueName: \"kubernetes.io/projected/bc3adf05-3cb9-4fda-be48-67b6b3084179-kube-api-access-b9r6v\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.624009 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.635250 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2wcrf"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.635435 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.635812 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qjp5l" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.635929 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 
09:34:33.638921 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-run-httpd\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.639356 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-log-httpd\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.653065 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9r6v\" (UniqueName: \"kubernetes.io/projected/bc3adf05-3cb9-4fda-be48-67b6b3084179-kube-api-access-b9r6v\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.665527 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-config-data\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.668547 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.668943 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.669567 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-scripts\") pod \"ceilometer-0\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.684817 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-r8pdd"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.727963 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-config-data\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.728054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9713cf71-536f-4674-8184-7c7651dad952-logs\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.728092 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-scripts\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.728126 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-combined-ca-bundle\") 
pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.728155 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2wk\" (UniqueName: \"kubernetes.io/projected/9713cf71-536f-4674-8184-7c7651dad952-kube-api-access-rg2wk\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.739676 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76bbdbbbb9-9p7rc"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.749381 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.758414 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76bbdbbbb9-9p7rc"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.798086 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56798b757f-4m7pp"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.799876 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.810584 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-4m7pp"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.819746 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.822288 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.825722 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.829328 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.829563 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.830986 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97kmg\" (UniqueName: \"kubernetes.io/projected/edd9483b-68c8-4e5e-a562-db46b7ac592f-kube-api-access-97kmg\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.831041 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-config-data\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.831076 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-combined-ca-bundle\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.831112 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2wk\" (UniqueName: \"kubernetes.io/projected/9713cf71-536f-4674-8184-7c7651dad952-kube-api-access-rg2wk\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " 
pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.831167 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/edd9483b-68c8-4e5e-a562-db46b7ac592f-horizon-secret-key\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.831223 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edd9483b-68c8-4e5e-a562-db46b7ac592f-logs\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.831251 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-config-data\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.831275 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-scripts\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.831303 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9713cf71-536f-4674-8184-7c7651dad952-logs\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 
09:34:33.831331 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-scripts\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.832307 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.832467 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9713cf71-536f-4674-8184-7c7651dad952-logs\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.832523 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9vttg" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.835401 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.857141 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.858883 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.863673 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.864102 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.866927 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-config-data\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.870987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-combined-ca-bundle\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.874487 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.886257 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2wk\" (UniqueName: \"kubernetes.io/projected/9713cf71-536f-4674-8184-7c7651dad952-kube-api-access-rg2wk\") pod \"placement-db-sync-2wcrf\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.894213 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-scripts\") pod \"placement-db-sync-2wcrf\" (UID: 
\"9713cf71-536f-4674-8184-7c7651dad952\") " pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934424 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934492 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b8c6\" (UniqueName: \"kubernetes.io/projected/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-kube-api-access-2b8c6\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934530 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-config-data\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934647 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-dns-svc\") pod 
\"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934673 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/edd9483b-68c8-4e5e-a562-db46b7ac592f-horizon-secret-key\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934711 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934743 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs7mx\" (UniqueName: \"kubernetes.io/projected/a4a68d64-f644-4e4a-a216-af618d1883c8-kube-api-access-xs7mx\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934777 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934800 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-logs\") pod 
\"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934852 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934872 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934913 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934945 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.934981 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edd9483b-68c8-4e5e-a562-db46b7ac592f-logs\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935002 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935039 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935054 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935080 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-scripts\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-scripts\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935139 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsg8f\" (UniqueName: \"kubernetes.io/projected/3076c40d-fd20-4012-b09f-7a44a031ae59-kube-api-access-xsg8f\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935176 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97kmg\" (UniqueName: \"kubernetes.io/projected/edd9483b-68c8-4e5e-a562-db46b7ac592f-kube-api-access-97kmg\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935195 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-config-data\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.935223 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-config\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.936647 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-scripts\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.936665 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edd9483b-68c8-4e5e-a562-db46b7ac592f-logs\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.949914 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/edd9483b-68c8-4e5e-a562-db46b7ac592f-horizon-secret-key\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.949814 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-config-data\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: 
\"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:33 crc kubenswrapper[4830]: I0311 09:34:33.952251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97kmg\" (UniqueName: \"kubernetes.io/projected/edd9483b-68c8-4e5e-a562-db46b7ac592f-kube-api-access-97kmg\") pod \"horizon-76bbdbbbb9-9p7rc\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.024609 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-r8pdd"] Mar 11 09:34:34 crc kubenswrapper[4830]: W0311 09:34:34.028008 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod252f99e4_09b2_4bd4_aeef_35f4d00b34c9.slice/crio-5dab8fc4492813451db84477231dc3731d17f4e136a27efc4d0342f43b7f0615 WatchSource:0}: Error finding container 5dab8fc4492813451db84477231dc3731d17f4e136a27efc4d0342f43b7f0615: Status 404 returned error can't find the container with id 5dab8fc4492813451db84477231dc3731d17f4e136a27efc4d0342f43b7f0615 Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041579 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-dns-svc\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041663 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041690 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs7mx\" (UniqueName: \"kubernetes.io/projected/a4a68d64-f644-4e4a-a216-af618d1883c8-kube-api-access-xs7mx\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041738 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041765 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-logs\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041789 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041810 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041833 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041853 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041876 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041915 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041956 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.041983 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.042008 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.042062 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-scripts\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.042107 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsg8f\" (UniqueName: \"kubernetes.io/projected/3076c40d-fd20-4012-b09f-7a44a031ae59-kube-api-access-xsg8f\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.042163 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-config\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.042191 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.042223 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b8c6\" (UniqueName: \"kubernetes.io/projected/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-kube-api-access-2b8c6\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.042249 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.042298 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-config-data\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.043490 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-dns-svc\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.045332 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") 
device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.045351 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.047208 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.048895 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.050096 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.051057 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.051328 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.052032 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-sb\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.052100 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.052289 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-logs\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.055620 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.060780 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-config\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.063552 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-nb\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.067438 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b8c6\" (UniqueName: \"kubernetes.io/projected/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-kube-api-access-2b8c6\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.070679 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2wcrf" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.072071 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-config-data\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.077831 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.084056 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.086932 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-scripts\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.092887 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.097547 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs7mx\" (UniqueName: \"kubernetes.io/projected/a4a68d64-f644-4e4a-a216-af618d1883c8-kube-api-access-xs7mx\") pod \"dnsmasq-dns-56798b757f-4m7pp\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.101294 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsg8f\" (UniqueName: \"kubernetes.io/projected/3076c40d-fd20-4012-b09f-7a44a031ae59-kube-api-access-xsg8f\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.124038 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.125609 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.200154 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.220850 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.237579 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.281075 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bdc7dd545-m2jlq"] Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.306611 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p2qw2"] Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.439376 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lqwp9"] Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.598262 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fwspb"] Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.638123 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.652280 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pl7d8"] Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.671935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5679f497-r8pdd" event={"ID":"252f99e4-09b2-4bd4-aeef-35f4d00b34c9","Type":"ContainerStarted","Data":"5dab8fc4492813451db84477231dc3731d17f4e136a27efc4d0342f43b7f0615"} Mar 11 09:34:34 crc kubenswrapper[4830]: W0311 09:34:34.685329 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod253897f0_4649_46c8_9bb3_9d25a4864701.slice/crio-0a300820ab826816cc64c23d1843313baca9f09ff15f8786c31435e87f203207 WatchSource:0}: Error finding container 0a300820ab826816cc64c23d1843313baca9f09ff15f8786c31435e87f203207: Status 404 returned error can't find the container with id 
0a300820ab826816cc64c23d1843313baca9f09ff15f8786c31435e87f203207 Mar 11 09:34:34 crc kubenswrapper[4830]: W0311 09:34:34.704847 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc3adf05_3cb9_4fda_be48_67b6b3084179.slice/crio-6253530cd7ec576fa880a84675bc596e10ff1c23e2b88e95d5243d01305a9d37 WatchSource:0}: Error finding container 6253530cd7ec576fa880a84675bc596e10ff1c23e2b88e95d5243d01305a9d37: Status 404 returned error can't find the container with id 6253530cd7ec576fa880a84675bc596e10ff1c23e2b88e95d5243d01305a9d37 Mar 11 09:34:34 crc kubenswrapper[4830]: W0311 09:34:34.708049 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod045501ed_58bb_4a38_9b4a_5091217cf610.slice/crio-8767287839fccd93826ffb117034d90549292ebbb319c1bc3e130e845d9f1f1a WatchSource:0}: Error finding container 8767287839fccd93826ffb117034d90549292ebbb319c1bc3e130e845d9f1f1a: Status 404 returned error can't find the container with id 8767287839fccd93826ffb117034d90549292ebbb319c1bc3e130e845d9f1f1a Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.749424 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76bbdbbbb9-9p7rc"] Mar 11 09:34:34 crc kubenswrapper[4830]: W0311 09:34:34.767476 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedd9483b_68c8_4e5e_a562_db46b7ac592f.slice/crio-cca47160e60bcb0faa6753a52ee1a545862a0a1af59942413c0f309b0dfeab3b WatchSource:0}: Error finding container cca47160e60bcb0faa6753a52ee1a545862a0a1af59942413c0f309b0dfeab3b: Status 404 returned error can't find the container with id cca47160e60bcb0faa6753a52ee1a545862a0a1af59942413c0f309b0dfeab3b Mar 11 09:34:34 crc kubenswrapper[4830]: I0311 09:34:34.851867 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-2wcrf"] Mar 11 09:34:34 crc kubenswrapper[4830]: W0311 09:34:34.900228 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9713cf71_536f_4674_8184_7c7651dad952.slice/crio-3f370bf2dbe7bc49ab032f6808efdbe375ecc503d49c6ced440eeff0a08554d2 WatchSource:0}: Error finding container 3f370bf2dbe7bc49ab032f6808efdbe375ecc503d49c6ced440eeff0a08554d2: Status 404 returned error can't find the container with id 3f370bf2dbe7bc49ab032f6808efdbe375ecc503d49c6ced440eeff0a08554d2 Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.120310 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6bdc7dd545-m2jlq"] Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.180572 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.198535 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74675f7bbf-5c9lf"] Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.200345 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.218942 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74675f7bbf-5c9lf"] Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.273444 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-horizon-secret-key\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.273499 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-config-data\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.273525 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jppmn\" (UniqueName: \"kubernetes.io/projected/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-kube-api-access-jppmn\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.273577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-logs\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.273700 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-scripts\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.343854 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.376505 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-scripts\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.376557 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-horizon-secret-key\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.376596 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-config-data\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.376621 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jppmn\" (UniqueName: \"kubernetes.io/projected/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-kube-api-access-jppmn\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.376672 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-logs\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.377336 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-logs\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.378148 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-config-data\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.378915 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-scripts\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.384528 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.390808 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-horizon-secret-key\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.395297 4830 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.403217 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jppmn\" (UniqueName: \"kubernetes.io/projected/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-kube-api-access-jppmn\") pod \"horizon-74675f7bbf-5c9lf\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.494005 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-4m7pp"] Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.559792 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.747795 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p2qw2" event={"ID":"4a2c321a-3e97-44a9-b624-f87d2df01b9c","Type":"ContainerStarted","Data":"154a84252a8c8c643706a35abf7adbec751e0ec395aa5f2f14476ddc0117fdca"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.747851 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p2qw2" event={"ID":"4a2c321a-3e97-44a9-b624-f87d2df01b9c","Type":"ContainerStarted","Data":"6a255ebdcfca87c209f542010340706ed03031edba9e0c286aa78053dd0de637"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.765990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2wcrf" event={"ID":"9713cf71-536f-4674-8184-7c7651dad952","Type":"ContainerStarted","Data":"3f370bf2dbe7bc49ab032f6808efdbe375ecc503d49c6ced440eeff0a08554d2"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.781598 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lqwp9" 
event={"ID":"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab","Type":"ContainerStarted","Data":"d71895e5b9d5280b4ebb7c4ffd9eb57c474d362943489245698be9c2012895f5"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.807883 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-p2qw2" podStartSLOduration=3.807865495 podStartE2EDuration="3.807865495s" podCreationTimestamp="2026-03-11 09:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:35.79797648 +0000 UTC m=+1243.579127179" watchObservedRunningTime="2026-03-11 09:34:35.807865495 +0000 UTC m=+1243.589016184" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.815136 4830 generic.go:334] "Generic (PLEG): container finished" podID="252f99e4-09b2-4bd4-aeef-35f4d00b34c9" containerID="888b163fc6ada571569d8075655ffd5b4a930cd7fde91c71b80200a1baec9811" exitCode=0 Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.815238 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5679f497-r8pdd" event={"ID":"252f99e4-09b2-4bd4-aeef-35f4d00b34c9","Type":"ContainerDied","Data":"888b163fc6ada571569d8075655ffd5b4a930cd7fde91c71b80200a1baec9811"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.840880 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fwspb" event={"ID":"253897f0-4649-46c8-9bb3-9d25a4864701","Type":"ContainerStarted","Data":"1cf30dcfc616d5220609e5b56d96126ad66cb4802eee570f8d75ef7e0185826e"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.840926 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fwspb" event={"ID":"253897f0-4649-46c8-9bb3-9d25a4864701","Type":"ContainerStarted","Data":"0a300820ab826816cc64c23d1843313baca9f09ff15f8786c31435e87f203207"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.842565 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3076c40d-fd20-4012-b09f-7a44a031ae59","Type":"ContainerStarted","Data":"04f0742a58d1a6c5d1aab9a68f42a8f216862b5cc45ee1a446a08361162ee089"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.843984 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc3adf05-3cb9-4fda-be48-67b6b3084179","Type":"ContainerStarted","Data":"6253530cd7ec576fa880a84675bc596e10ff1c23e2b88e95d5243d01305a9d37"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.855808 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bbdbbbb9-9p7rc" event={"ID":"edd9483b-68c8-4e5e-a562-db46b7ac592f","Type":"ContainerStarted","Data":"cca47160e60bcb0faa6753a52ee1a545862a0a1af59942413c0f309b0dfeab3b"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.902004 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fwspb" podStartSLOduration=2.901988656 podStartE2EDuration="2.901988656s" podCreationTimestamp="2026-03-11 09:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:35.901321877 +0000 UTC m=+1243.682472566" watchObservedRunningTime="2026-03-11 09:34:35.901988656 +0000 UTC m=+1243.683139345" Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.940301 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"5e4f3a37a6c0d9aea7c8b1be67ed9bdfcf8f97081e811d352dfedc35182ec7b5"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.957235 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" 
event={"ID":"a4a68d64-f644-4e4a-a216-af618d1883c8","Type":"ContainerStarted","Data":"0ccdde2761a90b9ccc3eb226cf50fc64f4f3a2feb6aca61ecf1ce54e12f66fd9"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.965487 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pl7d8" event={"ID":"045501ed-58bb-4a38-9b4a-5091217cf610","Type":"ContainerStarted","Data":"8767287839fccd93826ffb117034d90549292ebbb319c1bc3e130e845d9f1f1a"} Mar 11 09:34:35 crc kubenswrapper[4830]: I0311 09:34:35.974951 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bdc7dd545-m2jlq" event={"ID":"c1e60a5c-559c-4667-adf4-14e8e7066569","Type":"ContainerStarted","Data":"dc374e3972104b5cc4a5a50f2fd45a69042be4000948dee7ceebcea3cdf35b99"} Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.008373 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:34:36 crc kubenswrapper[4830]: W0311 09:34:36.095247 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bbe2736_f9c5_46d4_be6e_ed4f9909c2b8.slice/crio-4fec03acd2aebf9dbedeaf868be72acb77e296f500d86a91ed905fc15c194006 WatchSource:0}: Error finding container 4fec03acd2aebf9dbedeaf868be72acb77e296f500d86a91ed905fc15c194006: Status 404 returned error can't find the container with id 4fec03acd2aebf9dbedeaf868be72acb77e296f500d86a91ed905fc15c194006 Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.460206 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-4hkw2" podUID="6d65cbea-4a39-478f-91b1-60e6dd72b135" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: i/o timeout" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.586977 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5679f497-r8pdd" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.714870 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74675f7bbf-5c9lf"] Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.728324 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-config\") pod \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.728796 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-sb\") pod \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.728824 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-nb\") pod \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.728901 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8xzm\" (UniqueName: \"kubernetes.io/projected/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-kube-api-access-g8xzm\") pod \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.728933 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-dns-svc\") pod \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\" (UID: \"252f99e4-09b2-4bd4-aeef-35f4d00b34c9\") " Mar 11 09:34:36 
crc kubenswrapper[4830]: W0311 09:34:36.741756 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f1f2c2d_80c5_4859_9cbd_1fc0732fcabc.slice/crio-d43426154ae4d0c6cf438c414c1446abba3f284ac0e3720272bf353de5abf9e5 WatchSource:0}: Error finding container d43426154ae4d0c6cf438c414c1446abba3f284ac0e3720272bf353de5abf9e5: Status 404 returned error can't find the container with id d43426154ae4d0c6cf438c414c1446abba3f284ac0e3720272bf353de5abf9e5 Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.745450 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-kube-api-access-g8xzm" (OuterVolumeSpecName: "kube-api-access-g8xzm") pod "252f99e4-09b2-4bd4-aeef-35f4d00b34c9" (UID: "252f99e4-09b2-4bd4-aeef-35f4d00b34c9"). InnerVolumeSpecName "kube-api-access-g8xzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.771164 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-config" (OuterVolumeSpecName: "config") pod "252f99e4-09b2-4bd4-aeef-35f4d00b34c9" (UID: "252f99e4-09b2-4bd4-aeef-35f4d00b34c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.772344 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "252f99e4-09b2-4bd4-aeef-35f4d00b34c9" (UID: "252f99e4-09b2-4bd4-aeef-35f4d00b34c9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.773059 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "252f99e4-09b2-4bd4-aeef-35f4d00b34c9" (UID: "252f99e4-09b2-4bd4-aeef-35f4d00b34c9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.780706 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "252f99e4-09b2-4bd4-aeef-35f4d00b34c9" (UID: "252f99e4-09b2-4bd4-aeef-35f4d00b34c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.830507 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.830778 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.830845 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.830903 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8xzm\" (UniqueName: \"kubernetes.io/projected/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-kube-api-access-g8xzm\") on node \"crc\" 
DevicePath \"\"" Mar 11 09:34:36 crc kubenswrapper[4830]: I0311 09:34:36.830959 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/252f99e4-09b2-4bd4-aeef-35f4d00b34c9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.010289 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8","Type":"ContainerStarted","Data":"4fec03acd2aebf9dbedeaf868be72acb77e296f500d86a91ed905fc15c194006"} Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.036934 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"f7155e936c46953c9a17f158fb1d04e5b1c8c5697e4a80b03a86b931d8164b3d"} Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.036976 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"259072b41f4f66743f73f5d3c24f0fd398ceecd924eb9c8a8ca8f106acbd4b25"} Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.039205 4830 generic.go:334] "Generic (PLEG): container finished" podID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerID="77e2aaaf0900d73db61d6bf3dd9eaa6395415af722d9b32933901404a2116700" exitCode=0 Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.039252 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" event={"ID":"a4a68d64-f644-4e4a-a216-af618d1883c8","Type":"ContainerDied","Data":"77e2aaaf0900d73db61d6bf3dd9eaa6395415af722d9b32933901404a2116700"} Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.041148 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"3076c40d-fd20-4012-b09f-7a44a031ae59","Type":"ContainerStarted","Data":"fbdedc2e00c9155833f491dc70f345507857db496bde9aa0567ff5f0cc4e06ca"} Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.042107 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74675f7bbf-5c9lf" event={"ID":"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc","Type":"ContainerStarted","Data":"d43426154ae4d0c6cf438c414c1446abba3f284ac0e3720272bf353de5abf9e5"} Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.044256 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5679f497-r8pdd" event={"ID":"252f99e4-09b2-4bd4-aeef-35f4d00b34c9","Type":"ContainerDied","Data":"5dab8fc4492813451db84477231dc3731d17f4e136a27efc4d0342f43b7f0615"} Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.044323 4830 scope.go:117] "RemoveContainer" containerID="888b163fc6ada571569d8075655ffd5b4a930cd7fde91c71b80200a1baec9811" Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.044309 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d5679f497-r8pdd" Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.157072 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-r8pdd"] Mar 11 09:34:37 crc kubenswrapper[4830]: I0311 09:34:37.171210 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d5679f497-r8pdd"] Mar 11 09:34:38 crc kubenswrapper[4830]: I0311 09:34:38.068545 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"4f69fcacd610bdb4678cd656670a64f7feb0a73969ef844834bded2bd2e76dce"} Mar 11 09:34:38 crc kubenswrapper[4830]: I0311 09:34:38.954345 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="252f99e4-09b2-4bd4-aeef-35f4d00b34c9" path="/var/lib/kubelet/pods/252f99e4-09b2-4bd4-aeef-35f4d00b34c9/volumes" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.097114 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8","Type":"ContainerStarted","Data":"ee9080de41c880e997ed147ebad7dc87ddbe02e0a39e92ce73c0c86d47080c60"} Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.106646 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"93fa6ca1ff54431ca3ec13e042fa2aeb70f94561c3aecce449415b798f078413"} Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.108990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" event={"ID":"a4a68d64-f644-4e4a-a216-af618d1883c8","Type":"ContainerStarted","Data":"9090e448f5755db665471c0ce3878c3de13bad11806387d0acdad4a261dcab75"} Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.109171 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.113059 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3076c40d-fd20-4012-b09f-7a44a031ae59","Type":"ContainerStarted","Data":"e447dbd80eec7f016f80641594c602759cce2b8faac5bfdec3fa8860cd7ea29c"} Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.113215 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerName="glance-log" containerID="cri-o://fbdedc2e00c9155833f491dc70f345507857db496bde9aa0567ff5f0cc4e06ca" gracePeriod=30 Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.113236 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerName="glance-httpd" containerID="cri-o://e447dbd80eec7f016f80641594c602759cce2b8faac5bfdec3fa8860cd7ea29c" gracePeriod=30 Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.136756 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" podStartSLOduration=8.136743439 podStartE2EDuration="8.136743439s" podCreationTimestamp="2026-03-11 09:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:41.135306389 +0000 UTC m=+1248.916457078" watchObservedRunningTime="2026-03-11 09:34:41.136743439 +0000 UTC m=+1248.917894128" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.169557 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.169541779 podStartE2EDuration="8.169541779s" podCreationTimestamp="2026-03-11 09:34:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:41.169054635 +0000 UTC m=+1248.950205334" watchObservedRunningTime="2026-03-11 09:34:41.169541779 +0000 UTC m=+1248.950692468" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.643216 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76bbdbbbb9-9p7rc"] Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.669358 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f6b87df74-q5t2v"] Mar 11 09:34:41 crc kubenswrapper[4830]: E0311 09:34:41.669813 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252f99e4-09b2-4bd4-aeef-35f4d00b34c9" containerName="init" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.669828 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="252f99e4-09b2-4bd4-aeef-35f4d00b34c9" containerName="init" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.670143 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="252f99e4-09b2-4bd4-aeef-35f4d00b34c9" containerName="init" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.671385 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.674192 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.685876 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f6b87df74-q5t2v"] Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.733393 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlwps\" (UniqueName: \"kubernetes.io/projected/242c5a27-bc92-42f0-b630-6d1f3cd55822-kube-api-access-rlwps\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.733502 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-combined-ca-bundle\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.733572 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-scripts\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.733633 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-secret-key\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" 
Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.733669 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-tls-certs\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.733747 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-config-data\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.733783 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242c5a27-bc92-42f0-b630-6d1f3cd55822-logs\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.828863 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74675f7bbf-5c9lf"] Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.840106 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-scripts\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.840181 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-secret-key\") pod \"horizon-5f6b87df74-q5t2v\" 
(UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.840219 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-tls-certs\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.840271 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-config-data\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.840305 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242c5a27-bc92-42f0-b630-6d1f3cd55822-logs\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.840351 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlwps\" (UniqueName: \"kubernetes.io/projected/242c5a27-bc92-42f0-b630-6d1f3cd55822-kube-api-access-rlwps\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.840432 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-combined-ca-bundle\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 
09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.842419 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242c5a27-bc92-42f0-b630-6d1f3cd55822-logs\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.844060 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-scripts\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.844758 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-config-data\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.846695 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-combined-ca-bundle\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.850017 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-tls-certs\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.866624 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-secret-key\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.872636 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlwps\" (UniqueName: \"kubernetes.io/projected/242c5a27-bc92-42f0-b630-6d1f3cd55822-kube-api-access-rlwps\") pod \"horizon-5f6b87df74-q5t2v\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.946875 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-789dc4b6cd-xz7ds"] Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.948410 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:41 crc kubenswrapper[4830]: I0311 09:34:41.974214 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-789dc4b6cd-xz7ds"] Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.009569 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.046103 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-horizon-secret-key\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.046173 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-combined-ca-bundle\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.046465 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdk9\" (UniqueName: \"kubernetes.io/projected/77e86c78-b565-4e6c-8867-519fa2d5137a-kube-api-access-hrdk9\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.046546 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77e86c78-b565-4e6c-8867-519fa2d5137a-config-data\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.046638 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77e86c78-b565-4e6c-8867-519fa2d5137a-scripts\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: 
\"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.046654 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77e86c78-b565-4e6c-8867-519fa2d5137a-logs\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.046713 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-horizon-tls-certs\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.127386 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8","Type":"ContainerStarted","Data":"268e9b643795281863663820770ee1e949311d0d5a87d80799fbc19bc2b6c320"} Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.127545 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerName="glance-log" containerID="cri-o://ee9080de41c880e997ed147ebad7dc87ddbe02e0a39e92ce73c0c86d47080c60" gracePeriod=30 Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.128022 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerName="glance-httpd" containerID="cri-o://268e9b643795281863663820770ee1e949311d0d5a87d80799fbc19bc2b6c320" gracePeriod=30 Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.137414 4830 generic.go:334] "Generic 
(PLEG): container finished" podID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerID="e447dbd80eec7f016f80641594c602759cce2b8faac5bfdec3fa8860cd7ea29c" exitCode=0 Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.137497 4830 generic.go:334] "Generic (PLEG): container finished" podID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerID="fbdedc2e00c9155833f491dc70f345507857db496bde9aa0567ff5f0cc4e06ca" exitCode=143 Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.138555 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3076c40d-fd20-4012-b09f-7a44a031ae59","Type":"ContainerDied","Data":"e447dbd80eec7f016f80641594c602759cce2b8faac5bfdec3fa8860cd7ea29c"} Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.138597 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3076c40d-fd20-4012-b09f-7a44a031ae59","Type":"ContainerDied","Data":"fbdedc2e00c9155833f491dc70f345507857db496bde9aa0567ff5f0cc4e06ca"} Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.147680 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdk9\" (UniqueName: \"kubernetes.io/projected/77e86c78-b565-4e6c-8867-519fa2d5137a-kube-api-access-hrdk9\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.147748 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77e86c78-b565-4e6c-8867-519fa2d5137a-config-data\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.147792 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/77e86c78-b565-4e6c-8867-519fa2d5137a-scripts\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.147808 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77e86c78-b565-4e6c-8867-519fa2d5137a-logs\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.147848 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-horizon-tls-certs\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.147902 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-horizon-secret-key\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.147922 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-combined-ca-bundle\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.160349 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77e86c78-b565-4e6c-8867-519fa2d5137a-scripts\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: 
\"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.160606 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77e86c78-b565-4e6c-8867-519fa2d5137a-logs\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.161818 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-horizon-tls-certs\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.186965 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdk9\" (UniqueName: \"kubernetes.io/projected/77e86c78-b565-4e6c-8867-519fa2d5137a-kube-api-access-hrdk9\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.190295 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.190271921 podStartE2EDuration="9.190271921s" podCreationTimestamp="2026-03-11 09:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:34:42.173311171 +0000 UTC m=+1249.954461870" watchObservedRunningTime="2026-03-11 09:34:42.190271921 +0000 UTC m=+1249.971422610" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.192941 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-combined-ca-bundle\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.196444 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77e86c78-b565-4e6c-8867-519fa2d5137a-config-data\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.201429 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/77e86c78-b565-4e6c-8867-519fa2d5137a-horizon-secret-key\") pod \"horizon-789dc4b6cd-xz7ds\" (UID: \"77e86c78-b565-4e6c-8867-519fa2d5137a\") " pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:42 crc kubenswrapper[4830]: I0311 09:34:42.336878 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:34:43 crc kubenswrapper[4830]: I0311 09:34:43.167895 4830 generic.go:334] "Generic (PLEG): container finished" podID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerID="268e9b643795281863663820770ee1e949311d0d5a87d80799fbc19bc2b6c320" exitCode=143 Mar 11 09:34:43 crc kubenswrapper[4830]: I0311 09:34:43.167940 4830 generic.go:334] "Generic (PLEG): container finished" podID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerID="ee9080de41c880e997ed147ebad7dc87ddbe02e0a39e92ce73c0c86d47080c60" exitCode=143 Mar 11 09:34:43 crc kubenswrapper[4830]: I0311 09:34:43.167964 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8","Type":"ContainerDied","Data":"268e9b643795281863663820770ee1e949311d0d5a87d80799fbc19bc2b6c320"} Mar 11 09:34:43 crc kubenswrapper[4830]: I0311 09:34:43.167993 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8","Type":"ContainerDied","Data":"ee9080de41c880e997ed147ebad7dc87ddbe02e0a39e92ce73c0c86d47080c60"} Mar 11 09:34:47 crc kubenswrapper[4830]: I0311 09:34:47.207987 4830 generic.go:334] "Generic (PLEG): container finished" podID="4a2c321a-3e97-44a9-b624-f87d2df01b9c" containerID="154a84252a8c8c643706a35abf7adbec751e0ec395aa5f2f14476ddc0117fdca" exitCode=0 Mar 11 09:34:47 crc kubenswrapper[4830]: I0311 09:34:47.208102 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p2qw2" event={"ID":"4a2c321a-3e97-44a9-b624-f87d2df01b9c","Type":"ContainerDied","Data":"154a84252a8c8c643706a35abf7adbec751e0ec395aa5f2f14476ddc0117fdca"} Mar 11 09:34:49 crc kubenswrapper[4830]: I0311 09:34:49.202526 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:34:49 crc kubenswrapper[4830]: 
I0311 09:34:49.252680 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jxw5l"] Mar 11 09:34:49 crc kubenswrapper[4830]: I0311 09:34:49.252919 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="dnsmasq-dns" containerID="cri-o://165f49ac949ccec8e4ec006d3a7832a727b81f47d10d2c1d8969e1b392ea9bd6" gracePeriod=10 Mar 11 09:34:49 crc kubenswrapper[4830]: E0311 09:34:49.932558 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 11 09:34:49 crc kubenswrapper[4830]: E0311 09:34:49.933007 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n579h8h89hf8h56ch4hcfh5b7h577h546h66bh55h56ch96h54dh555h5fch599h65ch58fhd4h687h56bh599h84h55h68ch55ch5bdh549h595h5fcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-97kmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-76bbdbbbb9-9p7rc_openstack(edd9483b-68c8-4e5e-a562-db46b7ac592f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:34:49 crc kubenswrapper[4830]: E0311 
09:34:49.946011 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-76bbdbbbb9-9p7rc" podUID="edd9483b-68c8-4e5e-a562-db46b7ac592f" Mar 11 09:34:49 crc kubenswrapper[4830]: I0311 09:34:49.995304 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p2qw2" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.024909 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l468\" (UniqueName: \"kubernetes.io/projected/4a2c321a-3e97-44a9-b624-f87d2df01b9c-kube-api-access-9l468\") pod \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.025113 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-combined-ca-bundle\") pod \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.025143 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-fernet-keys\") pod \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.025188 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-scripts\") pod 
\"4a2c321a-3e97-44a9-b624-f87d2df01b9c\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.025218 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-credential-keys\") pod \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.026196 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-config-data\") pod \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\" (UID: \"4a2c321a-3e97-44a9-b624-f87d2df01b9c\") " Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.031416 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4a2c321a-3e97-44a9-b624-f87d2df01b9c" (UID: "4a2c321a-3e97-44a9-b624-f87d2df01b9c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.034158 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4a2c321a-3e97-44a9-b624-f87d2df01b9c" (UID: "4a2c321a-3e97-44a9-b624-f87d2df01b9c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.036741 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-scripts" (OuterVolumeSpecName: "scripts") pod "4a2c321a-3e97-44a9-b624-f87d2df01b9c" (UID: "4a2c321a-3e97-44a9-b624-f87d2df01b9c"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.039396 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2c321a-3e97-44a9-b624-f87d2df01b9c-kube-api-access-9l468" (OuterVolumeSpecName: "kube-api-access-9l468") pod "4a2c321a-3e97-44a9-b624-f87d2df01b9c" (UID: "4a2c321a-3e97-44a9-b624-f87d2df01b9c"). InnerVolumeSpecName "kube-api-access-9l468". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.062352 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-config-data" (OuterVolumeSpecName: "config-data") pod "4a2c321a-3e97-44a9-b624-f87d2df01b9c" (UID: "4a2c321a-3e97-44a9-b624-f87d2df01b9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.067687 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a2c321a-3e97-44a9-b624-f87d2df01b9c" (UID: "4a2c321a-3e97-44a9-b624-f87d2df01b9c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.127740 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.128183 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.128248 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.128263 4830 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.128275 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a2c321a-3e97-44a9-b624-f87d2df01b9c-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.128286 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l468\" (UniqueName: \"kubernetes.io/projected/4a2c321a-3e97-44a9-b624-f87d2df01b9c-kube-api-access-9l468\") on node \"crc\" DevicePath \"\"" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.236996 4830 generic.go:334] "Generic (PLEG): container finished" podID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerID="165f49ac949ccec8e4ec006d3a7832a727b81f47d10d2c1d8969e1b392ea9bd6" exitCode=0 Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.237085 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" event={"ID":"aad4e154-c667-41c1-9bf1-bf53a07a15b1","Type":"ContainerDied","Data":"165f49ac949ccec8e4ec006d3a7832a727b81f47d10d2c1d8969e1b392ea9bd6"} Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.238909 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p2qw2" Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.245331 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p2qw2" event={"ID":"4a2c321a-3e97-44a9-b624-f87d2df01b9c","Type":"ContainerDied","Data":"6a255ebdcfca87c209f542010340706ed03031edba9e0c286aa78053dd0de637"} Mar 11 09:34:50 crc kubenswrapper[4830]: I0311 09:34:50.245541 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a255ebdcfca87c209f542010340706ed03031edba9e0c286aa78053dd0de637" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.000252 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.163377 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-p2qw2"] Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.171156 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-p2qw2"] Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.269345 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-p47hl"] Mar 11 09:34:51 crc kubenswrapper[4830]: E0311 09:34:51.270031 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2c321a-3e97-44a9-b624-f87d2df01b9c" containerName="keystone-bootstrap" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.270054 
4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2c321a-3e97-44a9-b624-f87d2df01b9c" containerName="keystone-bootstrap" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.270389 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2c321a-3e97-44a9-b624-f87d2df01b9c" containerName="keystone-bootstrap" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.271093 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.273142 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.273283 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q9l9c" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.273454 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.273990 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.275412 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.294538 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p47hl"] Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.447657 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-combined-ca-bundle\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.447844 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-fernet-keys\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.447907 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-scripts\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.448004 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-credential-keys\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.448084 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-config-data\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.448310 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29c95\" (UniqueName: \"kubernetes.io/projected/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-kube-api-access-29c95\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.550326 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-credential-keys\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.550670 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-config-data\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.550972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29c95\" (UniqueName: \"kubernetes.io/projected/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-kube-api-access-29c95\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.551147 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-combined-ca-bundle\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.551296 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-fernet-keys\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.551504 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-scripts\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.555575 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-combined-ca-bundle\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.557853 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-credential-keys\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.558189 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-scripts\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.558558 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-config-data\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.560320 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-fernet-keys\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " 
pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.566002 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29c95\" (UniqueName: \"kubernetes.io/projected/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-kube-api-access-29c95\") pod \"keystone-bootstrap-p47hl\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:51 crc kubenswrapper[4830]: I0311 09:34:51.605384 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:34:52 crc kubenswrapper[4830]: I0311 09:34:52.969308 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2c321a-3e97-44a9-b624-f87d2df01b9c" path="/var/lib/kubelet/pods/4a2c321a-3e97-44a9-b624-f87d2df01b9c/volumes" Mar 11 09:34:56 crc kubenswrapper[4830]: I0311 09:34:56.001212 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Mar 11 09:34:56 crc kubenswrapper[4830]: I0311 09:34:56.615097 4830 scope.go:117] "RemoveContainer" containerID="63f3033d56819f54f0274d7284469e7705ea7356b5fd01d127ca7a394c42cefc" Mar 11 09:34:58 crc kubenswrapper[4830]: E0311 09:34:58.171722 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 11 09:34:58 crc kubenswrapper[4830]: E0311 09:34:58.172174 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n576h654h578h58dh58h75h68hf9h5c5h547h5d6h55dh5f8h658hbfh586h5h57bhf7h5c5h5dch686h585h546h89hbh579h9bh8fhf6h5fdh685q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cflv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6bdc7dd545-m2jlq_openstack(c1e60a5c-559c-4667-adf4-14e8e7066569): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:34:58 crc kubenswrapper[4830]: E0311 
09:34:58.174433 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6bdc7dd545-m2jlq" podUID="c1e60a5c-559c-4667-adf4-14e8e7066569" Mar 11 09:35:01 crc kubenswrapper[4830]: E0311 09:35:01.532254 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 11 09:35:01 crc kubenswrapper[4830]: E0311 09:35:01.532968 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndh5cbh645h9h569h65ch548hd7hdch556h588h57dh54dh5c5hf6h68fh6dh59h68fh667h596h97h555h687h58h85h99h65ch5fdh67fh555h656q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jppmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-74675f7bbf-5c9lf_openstack(7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:35:01 crc kubenswrapper[4830]: E0311 
09:35:01.536303 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-74675f7bbf-5c9lf" podUID="7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.272657 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.305251 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edd9483b-68c8-4e5e-a562-db46b7ac592f-logs\") pod \"edd9483b-68c8-4e5e-a562-db46b7ac592f\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.305344 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/edd9483b-68c8-4e5e-a562-db46b7ac592f-horizon-secret-key\") pod \"edd9483b-68c8-4e5e-a562-db46b7ac592f\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.305372 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-scripts\") pod \"edd9483b-68c8-4e5e-a562-db46b7ac592f\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.305476 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97kmg\" (UniqueName: \"kubernetes.io/projected/edd9483b-68c8-4e5e-a562-db46b7ac592f-kube-api-access-97kmg\") pod 
\"edd9483b-68c8-4e5e-a562-db46b7ac592f\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.305561 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd9483b-68c8-4e5e-a562-db46b7ac592f-logs" (OuterVolumeSpecName: "logs") pod "edd9483b-68c8-4e5e-a562-db46b7ac592f" (UID: "edd9483b-68c8-4e5e-a562-db46b7ac592f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.305624 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-config-data\") pod \"edd9483b-68c8-4e5e-a562-db46b7ac592f\" (UID: \"edd9483b-68c8-4e5e-a562-db46b7ac592f\") " Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.305918 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-scripts" (OuterVolumeSpecName: "scripts") pod "edd9483b-68c8-4e5e-a562-db46b7ac592f" (UID: "edd9483b-68c8-4e5e-a562-db46b7ac592f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.306212 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-config-data" (OuterVolumeSpecName: "config-data") pod "edd9483b-68c8-4e5e-a562-db46b7ac592f" (UID: "edd9483b-68c8-4e5e-a562-db46b7ac592f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.306453 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.306478 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edd9483b-68c8-4e5e-a562-db46b7ac592f-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.306488 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/edd9483b-68c8-4e5e-a562-db46b7ac592f-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.311903 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd9483b-68c8-4e5e-a562-db46b7ac592f-kube-api-access-97kmg" (OuterVolumeSpecName: "kube-api-access-97kmg") pod "edd9483b-68c8-4e5e-a562-db46b7ac592f" (UID: "edd9483b-68c8-4e5e-a562-db46b7ac592f"). InnerVolumeSpecName "kube-api-access-97kmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.312357 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd9483b-68c8-4e5e-a562-db46b7ac592f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "edd9483b-68c8-4e5e-a562-db46b7ac592f" (UID: "edd9483b-68c8-4e5e-a562-db46b7ac592f"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.390583 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bbdbbbb9-9p7rc" event={"ID":"edd9483b-68c8-4e5e-a562-db46b7ac592f","Type":"ContainerDied","Data":"cca47160e60bcb0faa6753a52ee1a545862a0a1af59942413c0f309b0dfeab3b"} Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.390632 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bbdbbbb9-9p7rc" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.407841 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97kmg\" (UniqueName: \"kubernetes.io/projected/edd9483b-68c8-4e5e-a562-db46b7ac592f-kube-api-access-97kmg\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.407870 4830 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/edd9483b-68c8-4e5e-a562-db46b7ac592f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.447142 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76bbdbbbb9-9p7rc"] Mar 11 09:35:03 crc kubenswrapper[4830]: I0311 09:35:03.455454 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76bbdbbbb9-9p7rc"] Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.224195 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.224279 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.238100 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 09:35:04 crc kubenswrapper[4830]: 
I0311 09:35:04.238172 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 09:35:04 crc kubenswrapper[4830]: E0311 09:35:04.660113 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 11 09:35:04 crc kubenswrapper[4830]: E0311 09:35:04.660539 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true
,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bvd7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lqwp9_openstack(fffcbc2c-4845-4a9d-8709-45eb4a28f0ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:35:04 crc kubenswrapper[4830]: E0311 09:35:04.662095 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lqwp9" podUID="fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.749830 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.757572 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.774704 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.840955 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-internal-tls-certs\") pod \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841059 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841161 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-sb\") pod \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841215 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1e60a5c-559c-4667-adf4-14e8e7066569-logs\") pod \"c1e60a5c-559c-4667-adf4-14e8e7066569\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841305 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5977\" (UniqueName: \"kubernetes.io/projected/aad4e154-c667-41c1-9bf1-bf53a07a15b1-kube-api-access-t5977\") pod \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841378 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-config-data\") pod \"c1e60a5c-559c-4667-adf4-14e8e7066569\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841409 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c1e60a5c-559c-4667-adf4-14e8e7066569-horizon-secret-key\") pod \"c1e60a5c-559c-4667-adf4-14e8e7066569\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841467 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b8c6\" (UniqueName: \"kubernetes.io/projected/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-kube-api-access-2b8c6\") pod \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841545 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-scripts\") pod \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841771 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-config\") pod \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841810 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-scripts\") pod \"c1e60a5c-559c-4667-adf4-14e8e7066569\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841887 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-combined-ca-bundle\") pod \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.841954 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-logs\") pod \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.842046 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-dns-svc\") pod \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\" (UID: \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.842127 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-config-data\") pod \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.842204 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cflv6\" (UniqueName: \"kubernetes.io/projected/c1e60a5c-559c-4667-adf4-14e8e7066569-kube-api-access-cflv6\") pod \"c1e60a5c-559c-4667-adf4-14e8e7066569\" (UID: \"c1e60a5c-559c-4667-adf4-14e8e7066569\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.842326 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-nb\") pod \"aad4e154-c667-41c1-9bf1-bf53a07a15b1\" (UID: 
\"aad4e154-c667-41c1-9bf1-bf53a07a15b1\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.842375 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-httpd-run\") pod \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\" (UID: \"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8\") " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.857607 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1e60a5c-559c-4667-adf4-14e8e7066569-logs" (OuterVolumeSpecName: "logs") pod "c1e60a5c-559c-4667-adf4-14e8e7066569" (UID: "c1e60a5c-559c-4667-adf4-14e8e7066569"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.859260 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-config-data" (OuterVolumeSpecName: "config-data") pod "c1e60a5c-559c-4667-adf4-14e8e7066569" (UID: "c1e60a5c-559c-4667-adf4-14e8e7066569"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.859661 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" (UID: "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.859990 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-scripts" (OuterVolumeSpecName: "scripts") pod "c1e60a5c-559c-4667-adf4-14e8e7066569" (UID: "c1e60a5c-559c-4667-adf4-14e8e7066569"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.861819 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-logs" (OuterVolumeSpecName: "logs") pod "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" (UID: "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.863445 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad4e154-c667-41c1-9bf1-bf53a07a15b1-kube-api-access-t5977" (OuterVolumeSpecName: "kube-api-access-t5977") pod "aad4e154-c667-41c1-9bf1-bf53a07a15b1" (UID: "aad4e154-c667-41c1-9bf1-bf53a07a15b1"). InnerVolumeSpecName "kube-api-access-t5977". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.864927 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" (UID: "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.871511 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-scripts" (OuterVolumeSpecName: "scripts") pod "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" (UID: "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.890864 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e60a5c-559c-4667-adf4-14e8e7066569-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c1e60a5c-559c-4667-adf4-14e8e7066569" (UID: "c1e60a5c-559c-4667-adf4-14e8e7066569"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.895501 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e60a5c-559c-4667-adf4-14e8e7066569-kube-api-access-cflv6" (OuterVolumeSpecName: "kube-api-access-cflv6") pod "c1e60a5c-559c-4667-adf4-14e8e7066569" (UID: "c1e60a5c-559c-4667-adf4-14e8e7066569"). InnerVolumeSpecName "kube-api-access-cflv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.895696 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-kube-api-access-2b8c6" (OuterVolumeSpecName: "kube-api-access-2b8c6") pod "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" (UID: "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8"). InnerVolumeSpecName "kube-api-access-2b8c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.900919 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" (UID: "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.904679 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aad4e154-c667-41c1-9bf1-bf53a07a15b1" (UID: "aad4e154-c667-41c1-9bf1-bf53a07a15b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.913303 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-config" (OuterVolumeSpecName: "config") pod "aad4e154-c667-41c1-9bf1-bf53a07a15b1" (UID: "aad4e154-c667-41c1-9bf1-bf53a07a15b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.919685 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" (UID: "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.926872 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-config-data" (OuterVolumeSpecName: "config-data") pod "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" (UID: "9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.929648 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aad4e154-c667-41c1-9bf1-bf53a07a15b1" (UID: "aad4e154-c667-41c1-9bf1-bf53a07a15b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.930428 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aad4e154-c667-41c1-9bf1-bf53a07a15b1" (UID: "aad4e154-c667-41c1-9bf1-bf53a07a15b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947095 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947140 4830 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c1e60a5c-559c-4667-adf4-14e8e7066569-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947152 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b8c6\" (UniqueName: \"kubernetes.io/projected/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-kube-api-access-2b8c6\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947163 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 
crc kubenswrapper[4830]: I0311 09:35:04.947420 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947438 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e60a5c-559c-4667-adf4-14e8e7066569-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947448 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947458 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947467 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947476 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947484 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cflv6\" (UniqueName: \"kubernetes.io/projected/c1e60a5c-559c-4667-adf4-14e8e7066569-kube-api-access-cflv6\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947492 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947500 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947508 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947509 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd9483b-68c8-4e5e-a562-db46b7ac592f" path="/var/lib/kubelet/pods/edd9483b-68c8-4e5e-a562-db46b7ac592f/volumes" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947537 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947549 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aad4e154-c667-41c1-9bf1-bf53a07a15b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947560 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1e60a5c-559c-4667-adf4-14e8e7066569-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 09:35:04.947570 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5977\" (UniqueName: \"kubernetes.io/projected/aad4e154-c667-41c1-9bf1-bf53a07a15b1-kube-api-access-t5977\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:04 crc kubenswrapper[4830]: I0311 
09:35:04.968404 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.048640 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.410244 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" event={"ID":"aad4e154-c667-41c1-9bf1-bf53a07a15b1","Type":"ContainerDied","Data":"9c8894f7c9b6918c2e123f8d8d1fb556e043723dc789271b86c9b9875e8cf7fa"} Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.410315 4830 scope.go:117] "RemoveContainer" containerID="165f49ac949ccec8e4ec006d3a7832a727b81f47d10d2c1d8969e1b392ea9bd6" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.410268 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.414585 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8","Type":"ContainerDied","Data":"4fec03acd2aebf9dbedeaf868be72acb77e296f500d86a91ed905fc15c194006"} Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.414638 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.420453 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6bdc7dd545-m2jlq" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.420852 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bdc7dd545-m2jlq" event={"ID":"c1e60a5c-559c-4667-adf4-14e8e7066569","Type":"ContainerDied","Data":"dc374e3972104b5cc4a5a50f2fd45a69042be4000948dee7ceebcea3cdf35b99"} Mar 11 09:35:05 crc kubenswrapper[4830]: E0311 09:35:05.422249 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-lqwp9" podUID="fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.437093 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jxw5l"] Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.443100 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-jxw5l"] Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.492430 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6bdc7dd545-m2jlq"] Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.499864 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6bdc7dd545-m2jlq"] Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.510818 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:35:05 crc kubenswrapper[4830]: E0311 09:35:05.513395 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 11 09:35:05 crc kubenswrapper[4830]: E0311 09:35:05.513635 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599hdh687h56dh5d7h664h64ch58h68dhbch5f8h5bfhc7h5b4h5f4h5fdh5b7h76hf4hb7h66h597h569h676h688hf5hcbh66bh64dh678h644h76q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b9r6v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(bc3adf05-3cb9-4fda-be48-67b6b3084179): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.515746 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.516374 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.532558 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.544892 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:35:05 crc kubenswrapper[4830]: E0311 09:35:05.545294 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="init" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.545306 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="init" Mar 11 09:35:05 crc kubenswrapper[4830]: E0311 09:35:05.545323 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="dnsmasq-dns" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.545330 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="dnsmasq-dns" Mar 11 09:35:05 crc kubenswrapper[4830]: E0311 09:35:05.545341 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerName="glance-httpd" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.545347 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerName="glance-httpd" Mar 11 09:35:05 crc kubenswrapper[4830]: E0311 09:35:05.545358 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerName="glance-log" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.545365 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerName="glance-log" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.545526 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="dnsmasq-dns" Mar 11 09:35:05 crc 
kubenswrapper[4830]: I0311 09:35:05.545539 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerName="glance-httpd" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.545552 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" containerName="glance-log" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.546429 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.552875 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.553609 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.592535 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.657519 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jppmn\" (UniqueName: \"kubernetes.io/projected/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-kube-api-access-jppmn\") pod \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.657640 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-config-data\") pod \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.657792 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-logs\") pod \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.657847 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-scripts\") pod \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.657901 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-horizon-secret-key\") pod \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\" (UID: \"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc\") " Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.658197 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.658249 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.658283 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.658367 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.658394 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.658472 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.658494 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.658543 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqlcm\" (UniqueName: \"kubernetes.io/projected/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-kube-api-access-mqlcm\") pod 
\"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.659820 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-config-data" (OuterVolumeSpecName: "config-data") pod "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc" (UID: "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.660375 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-logs" (OuterVolumeSpecName: "logs") pod "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc" (UID: "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.660711 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-scripts" (OuterVolumeSpecName: "scripts") pod "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc" (UID: "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.678124 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc" (UID: "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.678609 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-kube-api-access-jppmn" (OuterVolumeSpecName: "kube-api-access-jppmn") pod "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc" (UID: "7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc"). InnerVolumeSpecName "kube-api-access-jppmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760115 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760171 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760220 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760243 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760291 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqlcm\" (UniqueName: \"kubernetes.io/projected/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-kube-api-access-mqlcm\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760337 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760376 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760405 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760516 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760530 4830 reconciler_common.go:293] "Volume detached for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760542 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jppmn\" (UniqueName: \"kubernetes.io/projected/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-kube-api-access-jppmn\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760553 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760566 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760660 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-logs\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.760761 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.761475 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.770535 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.779952 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.782127 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.782248 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.804839 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqlcm\" (UniqueName: \"kubernetes.io/projected/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-kube-api-access-mqlcm\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " 
pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.842979 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:35:05 crc kubenswrapper[4830]: I0311 09:35:05.900263 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.000389 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74dc88fc-jxw5l" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: i/o timeout" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.428409 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74675f7bbf-5c9lf" event={"ID":"7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc","Type":"ContainerDied","Data":"d43426154ae4d0c6cf438c414c1446abba3f284ac0e3720272bf353de5abf9e5"} Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.428451 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74675f7bbf-5c9lf" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.456595 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.529890 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74675f7bbf-5c9lf"] Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.554894 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-74675f7bbf-5c9lf"] Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.572865 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsg8f\" (UniqueName: \"kubernetes.io/projected/3076c40d-fd20-4012-b09f-7a44a031ae59-kube-api-access-xsg8f\") pod \"3076c40d-fd20-4012-b09f-7a44a031ae59\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.572910 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-config-data\") pod \"3076c40d-fd20-4012-b09f-7a44a031ae59\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.572981 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-logs\") pod \"3076c40d-fd20-4012-b09f-7a44a031ae59\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.572999 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-scripts\") pod \"3076c40d-fd20-4012-b09f-7a44a031ae59\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.573048 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-combined-ca-bundle\") pod \"3076c40d-fd20-4012-b09f-7a44a031ae59\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.573086 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-public-tls-certs\") pod \"3076c40d-fd20-4012-b09f-7a44a031ae59\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.573134 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-httpd-run\") pod \"3076c40d-fd20-4012-b09f-7a44a031ae59\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.573453 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3076c40d-fd20-4012-b09f-7a44a031ae59\" (UID: \"3076c40d-fd20-4012-b09f-7a44a031ae59\") " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.574234 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3076c40d-fd20-4012-b09f-7a44a031ae59" (UID: "3076c40d-fd20-4012-b09f-7a44a031ae59"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.574664 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.574676 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-logs" (OuterVolumeSpecName: "logs") pod "3076c40d-fd20-4012-b09f-7a44a031ae59" (UID: "3076c40d-fd20-4012-b09f-7a44a031ae59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.577565 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-scripts" (OuterVolumeSpecName: "scripts") pod "3076c40d-fd20-4012-b09f-7a44a031ae59" (UID: "3076c40d-fd20-4012-b09f-7a44a031ae59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.579924 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3076c40d-fd20-4012-b09f-7a44a031ae59-kube-api-access-xsg8f" (OuterVolumeSpecName: "kube-api-access-xsg8f") pod "3076c40d-fd20-4012-b09f-7a44a031ae59" (UID: "3076c40d-fd20-4012-b09f-7a44a031ae59"). InnerVolumeSpecName "kube-api-access-xsg8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.580117 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "3076c40d-fd20-4012-b09f-7a44a031ae59" (UID: "3076c40d-fd20-4012-b09f-7a44a031ae59"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.589901 4830 scope.go:117] "RemoveContainer" containerID="50208acc0f85fbd28984ce3f29c0c7ccc77a2c0ea461d3f727357bff380188aa" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.605523 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3076c40d-fd20-4012-b09f-7a44a031ae59" (UID: "3076c40d-fd20-4012-b09f-7a44a031ae59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.633490 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-config-data" (OuterVolumeSpecName: "config-data") pod "3076c40d-fd20-4012-b09f-7a44a031ae59" (UID: "3076c40d-fd20-4012-b09f-7a44a031ae59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.636823 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3076c40d-fd20-4012-b09f-7a44a031ae59" (UID: "3076c40d-fd20-4012-b09f-7a44a031ae59"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:06 crc kubenswrapper[4830]: E0311 09:35:06.658710 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 11 09:35:06 crc kubenswrapper[4830]: E0311 09:35:06.659287 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2hl6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]Volume
Device{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-pl7d8_openstack(045501ed-58bb-4a38-9b4a-5091217cf610): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:35:06 crc kubenswrapper[4830]: E0311 09:35:06.660784 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-pl7d8" podUID="045501ed-58bb-4a38-9b4a-5091217cf610" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.677233 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.678116 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.678153 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.678163 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsg8f\" (UniqueName: \"kubernetes.io/projected/3076c40d-fd20-4012-b09f-7a44a031ae59-kube-api-access-xsg8f\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.678173 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:06 crc 
kubenswrapper[4830]: I0311 09:35:06.678181 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3076c40d-fd20-4012-b09f-7a44a031ae59-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.678191 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3076c40d-fd20-4012-b09f-7a44a031ae59-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.701849 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.725764 4830 scope.go:117] "RemoveContainer" containerID="268e9b643795281863663820770ee1e949311d0d5a87d80799fbc19bc2b6c320" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.783816 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.796605 4830 scope.go:117] "RemoveContainer" containerID="ee9080de41c880e997ed147ebad7dc87ddbe02e0a39e92ce73c0c86d47080c60" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.949537 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc" path="/var/lib/kubelet/pods/7f1f2c2d-80c5-4859-9cbd-1fc0732fcabc/volumes" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.950303 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8" path="/var/lib/kubelet/pods/9bbe2736-f9c5-46d4-be6e-ed4f9909c2b8/volumes" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.951191 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad4e154-c667-41c1-9bf1-bf53a07a15b1" 
path="/var/lib/kubelet/pods/aad4e154-c667-41c1-9bf1-bf53a07a15b1/volumes" Mar 11 09:35:06 crc kubenswrapper[4830]: I0311 09:35:06.952615 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e60a5c-559c-4667-adf4-14e8e7066569" path="/var/lib/kubelet/pods/c1e60a5c-559c-4667-adf4-14e8e7066569/volumes" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.046642 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f6b87df74-q5t2v"] Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.054501 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-789dc4b6cd-xz7ds"] Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.179629 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p47hl"] Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.204837 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.270152 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:35:07 crc kubenswrapper[4830]: W0311 09:35:07.272954 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65344ab0_3c56_4a1b_ac72_ef54fbf8da4a.slice/crio-a274c53d5ceb4bd315c3002caff9cf7d224f34f5b58efd0471d0090f3e9935a8 WatchSource:0}: Error finding container a274c53d5ceb4bd315c3002caff9cf7d224f34f5b58efd0471d0090f3e9935a8: Status 404 returned error can't find the container with id a274c53d5ceb4bd315c3002caff9cf7d224f34f5b58efd0471d0090f3e9935a8 Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.437279 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a","Type":"ContainerStarted","Data":"a274c53d5ceb4bd315c3002caff9cf7d224f34f5b58efd0471d0090f3e9935a8"} Mar 11 09:35:07 
crc kubenswrapper[4830]: I0311 09:35:07.438516 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p47hl" event={"ID":"fdcc3064-6041-40ab-b12e-6ca3f6bd6884","Type":"ContainerStarted","Data":"45c8e215cc6467b19e2ecfdba9d1fe3a19a176ac663aa857e1996bb9f61e13a2"} Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.439710 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6b87df74-q5t2v" event={"ID":"242c5a27-bc92-42f0-b630-6d1f3cd55822","Type":"ContainerStarted","Data":"fe95df3fb45002513d42a2ac462809f4eaa5ba7dcd9f05458b08da62b1a8d992"} Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.447336 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2wcrf" event={"ID":"9713cf71-536f-4674-8184-7c7651dad952","Type":"ContainerStarted","Data":"d489ff19093d1cfd7ba882d6f3ecd629bf5e50fd1c1e140deb578d0045dd862c"} Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.448171 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789dc4b6cd-xz7ds" event={"ID":"77e86c78-b565-4e6c-8867-519fa2d5137a","Type":"ContainerStarted","Data":"59019a04401f7a116d8cb399f5c7dea95cf187126da7857b192c0f895efb7f65"} Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.456461 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3076c40d-fd20-4012-b09f-7a44a031ae59","Type":"ContainerDied","Data":"04f0742a58d1a6c5d1aab9a68f42a8f216862b5cc45ee1a446a08361162ee089"} Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.456511 4830 scope.go:117] "RemoveContainer" containerID="e447dbd80eec7f016f80641594c602759cce2b8faac5bfdec3fa8860cd7ea29c" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.456630 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.467734 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2wcrf" podStartSLOduration=4.736962451 podStartE2EDuration="34.467710708s" podCreationTimestamp="2026-03-11 09:34:33 +0000 UTC" firstStartedPulling="2026-03-11 09:34:34.907736968 +0000 UTC m=+1242.688887657" lastFinishedPulling="2026-03-11 09:35:04.638485225 +0000 UTC m=+1272.419635914" observedRunningTime="2026-03-11 09:35:07.458892563 +0000 UTC m=+1275.240043252" watchObservedRunningTime="2026-03-11 09:35:07.467710708 +0000 UTC m=+1275.248861397" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.488792 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"03ad7e1a043295e4aa7b1e55fe231f4db92b6212406c1d4afd870d0da1454e73"} Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.488849 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"db4bedf3-ea20-4a63-9623-96286e9b243b","Type":"ContainerStarted","Data":"2fb3d28006aad45e04d10bc17d79e33fa1a6152f2e366aa03fcafc8aa6f31a28"} Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.493069 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.506228 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.514218 4830 scope.go:117] "RemoveContainer" containerID="fbdedc2e00c9155833f491dc70f345507857db496bde9aa0567ff5f0cc4e06ca" Mar 11 09:35:07 crc kubenswrapper[4830]: E0311 09:35:07.514278 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-pl7d8" podUID="045501ed-58bb-4a38-9b4a-5091217cf610" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.522744 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:35:07 crc kubenswrapper[4830]: E0311 09:35:07.523117 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerName="glance-httpd" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.523133 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerName="glance-httpd" Mar 11 09:35:07 crc kubenswrapper[4830]: E0311 09:35:07.523144 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerName="glance-log" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.523150 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerName="glance-log" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.523318 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerName="glance-httpd" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.523327 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" containerName="glance-log" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.524235 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.529007 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.529006 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.553060 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.701448 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.701491 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.701552 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.701622 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.701643 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmt97\" (UniqueName: \"kubernetes.io/projected/0e4a85e2-f1a7-4463-8795-55508a60df90-kube-api-access-qmt97\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.701693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-logs\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.701976 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.702155 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.803847 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.803929 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.803972 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.804002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.804115 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.804178 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.804201 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmt97\" (UniqueName: \"kubernetes.io/projected/0e4a85e2-f1a7-4463-8795-55508a60df90-kube-api-access-qmt97\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.804246 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-logs\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.804882 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.805158 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.805243 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-logs\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") 
" pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.808846 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.810660 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.827104 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.829200 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmt97\" (UniqueName: \"kubernetes.io/projected/0e4a85e2-f1a7-4463-8795-55508a60df90-kube-api-access-qmt97\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.837178 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: 
I0311 09:35:07.844551 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " pod="openstack/glance-default-external-api-0" Mar 11 09:35:07 crc kubenswrapper[4830]: I0311 09:35:07.851626 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.501939 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a","Type":"ContainerStarted","Data":"2b540663e7c741945a9ec92868556e9360efff84fd073507c873423b1393f6a6"} Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.503629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p47hl" event={"ID":"fdcc3064-6041-40ab-b12e-6ca3f6bd6884","Type":"ContainerStarted","Data":"a7cdb60fefc1d286dc9762da68be43b5b76aed17670411c8fe3c05d2c4dec84d"} Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.520192 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-p47hl" podStartSLOduration=17.5201663 podStartE2EDuration="17.5201663s" podCreationTimestamp="2026-03-11 09:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:08.516955621 +0000 UTC m=+1276.298106320" watchObservedRunningTime="2026-03-11 09:35:08.5201663 +0000 UTC m=+1276.301317009" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.730737 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=69.337904252 podStartE2EDuration="1m17.730717489s" podCreationTimestamp="2026-03-11 09:33:51 +0000 UTC" 
firstStartedPulling="2026-03-11 09:34:26.507402823 +0000 UTC m=+1234.288553512" lastFinishedPulling="2026-03-11 09:34:34.90021606 +0000 UTC m=+1242.681366749" observedRunningTime="2026-03-11 09:35:08.570442774 +0000 UTC m=+1276.351593483" watchObservedRunningTime="2026-03-11 09:35:08.730717489 +0000 UTC m=+1276.511868178" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.736307 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.842585 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vdn9k"] Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.844710 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.848292 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.856635 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vdn9k"] Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.937328 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.937591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpgj8\" (UniqueName: \"kubernetes.io/projected/99501380-81f7-4f53-8b8e-ca9e4cd51567-kube-api-access-fpgj8\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " 
pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.937662 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.937856 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.937924 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-config\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.937953 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:08 crc kubenswrapper[4830]: I0311 09:35:08.944111 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3076c40d-fd20-4012-b09f-7a44a031ae59" path="/var/lib/kubelet/pods/3076c40d-fd20-4012-b09f-7a44a031ae59/volumes" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.041212 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.041562 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.041614 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-config\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.041646 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.041716 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.041798 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fpgj8\" (UniqueName: \"kubernetes.io/projected/99501380-81f7-4f53-8b8e-ca9e4cd51567-kube-api-access-fpgj8\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.043945 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.044615 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-config\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.044955 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.045120 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.045648 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.070170 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpgj8\" (UniqueName: \"kubernetes.io/projected/99501380-81f7-4f53-8b8e-ca9e4cd51567-kube-api-access-fpgj8\") pod \"dnsmasq-dns-57c957c4ff-vdn9k\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.174384 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.515137 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a","Type":"ContainerStarted","Data":"e1c2ee352a1aa9ad3a87217fdc9b26896b4b6b672422c1eb960b0604da915f64"} Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.516897 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e4a85e2-f1a7-4463-8795-55508a60df90","Type":"ContainerStarted","Data":"da1881f79daaf08d17b8939b6424cf94d32e1f688d2cdfbd6425b78891405c5c"} Mar 11 09:35:09 crc kubenswrapper[4830]: I0311 09:35:09.541120 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.541092846 podStartE2EDuration="4.541092846s" podCreationTimestamp="2026-03-11 09:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:09.539476601 +0000 UTC m=+1277.320627330" watchObservedRunningTime="2026-03-11 09:35:09.541092846 +0000 UTC m=+1277.322243605" Mar 
11 09:35:10 crc kubenswrapper[4830]: I0311 09:35:10.527427 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc3adf05-3cb9-4fda-be48-67b6b3084179","Type":"ContainerStarted","Data":"535fb6734fa534d36c37e754401c45c8c8f4f668b2eeb53e3653430c917a55fb"} Mar 11 09:35:10 crc kubenswrapper[4830]: I0311 09:35:10.529351 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e4a85e2-f1a7-4463-8795-55508a60df90","Type":"ContainerStarted","Data":"c17c3e29f31e18bb52beea83b03bc239545f7ca00d58d5b0c72578eb927d435a"} Mar 11 09:35:10 crc kubenswrapper[4830]: I0311 09:35:10.537925 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789dc4b6cd-xz7ds" event={"ID":"77e86c78-b565-4e6c-8867-519fa2d5137a","Type":"ContainerStarted","Data":"a08190e1639196650ac1252846f983ef333596567a0d2e97142f0479f559d875"} Mar 11 09:35:10 crc kubenswrapper[4830]: I0311 09:35:10.540711 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6b87df74-q5t2v" event={"ID":"242c5a27-bc92-42f0-b630-6d1f3cd55822","Type":"ContainerStarted","Data":"af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10"} Mar 11 09:35:10 crc kubenswrapper[4830]: I0311 09:35:10.599119 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vdn9k"] Mar 11 09:35:10 crc kubenswrapper[4830]: W0311 09:35:10.614052 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99501380_81f7_4f53_8b8e_ca9e4cd51567.slice/crio-8dfaa19546eea2435461d4eb07cdaa5913c4b48426f15011bebf701f3d8770e0 WatchSource:0}: Error finding container 8dfaa19546eea2435461d4eb07cdaa5913c4b48426f15011bebf701f3d8770e0: Status 404 returned error can't find the container with id 8dfaa19546eea2435461d4eb07cdaa5913c4b48426f15011bebf701f3d8770e0 Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 
09:35:11.553663 4830 generic.go:334] "Generic (PLEG): container finished" podID="fdcc3064-6041-40ab-b12e-6ca3f6bd6884" containerID="a7cdb60fefc1d286dc9762da68be43b5b76aed17670411c8fe3c05d2c4dec84d" exitCode=0 Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.553757 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p47hl" event={"ID":"fdcc3064-6041-40ab-b12e-6ca3f6bd6884","Type":"ContainerDied","Data":"a7cdb60fefc1d286dc9762da68be43b5b76aed17670411c8fe3c05d2c4dec84d"} Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.556291 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e4a85e2-f1a7-4463-8795-55508a60df90","Type":"ContainerStarted","Data":"c2440926a74fefbe03c6e61dd97818c5b356bcd468059234f6638c9e396999e2"} Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.559172 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-789dc4b6cd-xz7ds" event={"ID":"77e86c78-b565-4e6c-8867-519fa2d5137a","Type":"ContainerStarted","Data":"8d089bc48ea79225eda2731b039cef2b29a99b502217c0bdde52848441f0b65b"} Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.561398 4830 generic.go:334] "Generic (PLEG): container finished" podID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerID="7138e4ba50c1b33f9fb2b7165fbf23990b1688348cc88054a79fd487a85df55a" exitCode=0 Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.561481 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" event={"ID":"99501380-81f7-4f53-8b8e-ca9e4cd51567","Type":"ContainerDied","Data":"7138e4ba50c1b33f9fb2b7165fbf23990b1688348cc88054a79fd487a85df55a"} Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.561613 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" 
event={"ID":"99501380-81f7-4f53-8b8e-ca9e4cd51567","Type":"ContainerStarted","Data":"8dfaa19546eea2435461d4eb07cdaa5913c4b48426f15011bebf701f3d8770e0"} Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.580854 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6b87df74-q5t2v" event={"ID":"242c5a27-bc92-42f0-b630-6d1f3cd55822","Type":"ContainerStarted","Data":"e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8"} Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.657982 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-789dc4b6cd-xz7ds" podStartSLOduration=27.564943302 podStartE2EDuration="30.657962211s" podCreationTimestamp="2026-03-11 09:34:41 +0000 UTC" firstStartedPulling="2026-03-11 09:35:07.070626395 +0000 UTC m=+1274.851777074" lastFinishedPulling="2026-03-11 09:35:10.163645294 +0000 UTC m=+1277.944795983" observedRunningTime="2026-03-11 09:35:11.653343033 +0000 UTC m=+1279.434493742" watchObservedRunningTime="2026-03-11 09:35:11.657962211 +0000 UTC m=+1279.439112890" Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.716390 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.716363531 podStartE2EDuration="4.716363531s" podCreationTimestamp="2026-03-11 09:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:11.676359171 +0000 UTC m=+1279.457509880" watchObservedRunningTime="2026-03-11 09:35:11.716363531 +0000 UTC m=+1279.497514220" Mar 11 09:35:11 crc kubenswrapper[4830]: I0311 09:35:11.822208 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5f6b87df74-q5t2v" podStartSLOduration=27.730836972 podStartE2EDuration="30.822166625s" podCreationTimestamp="2026-03-11 09:34:41 +0000 UTC" firstStartedPulling="2026-03-11 
09:35:07.067571139 +0000 UTC m=+1274.848721828" lastFinishedPulling="2026-03-11 09:35:10.158900792 +0000 UTC m=+1277.940051481" observedRunningTime="2026-03-11 09:35:11.711187557 +0000 UTC m=+1279.492338266" watchObservedRunningTime="2026-03-11 09:35:11.822166625 +0000 UTC m=+1279.603317314" Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.010075 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.010193 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.338431 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.339110 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.597219 4830 generic.go:334] "Generic (PLEG): container finished" podID="9713cf71-536f-4674-8184-7c7651dad952" containerID="d489ff19093d1cfd7ba882d6f3ecd629bf5e50fd1c1e140deb578d0045dd862c" exitCode=0 Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.597289 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2wcrf" event={"ID":"9713cf71-536f-4674-8184-7c7651dad952","Type":"ContainerDied","Data":"d489ff19093d1cfd7ba882d6f3ecd629bf5e50fd1c1e140deb578d0045dd862c"} Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.602566 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" event={"ID":"99501380-81f7-4f53-8b8e-ca9e4cd51567","Type":"ContainerStarted","Data":"11f5b495c9f37f75b9c337537150813638b88922858fcfdec58d43ae4efd47ee"} Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.652442 4830 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" podStartSLOduration=4.650990194 podStartE2EDuration="4.650990194s" podCreationTimestamp="2026-03-11 09:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:12.640327028 +0000 UTC m=+1280.421477737" watchObservedRunningTime="2026-03-11 09:35:12.650990194 +0000 UTC m=+1280.432140883" Mar 11 09:35:12 crc kubenswrapper[4830]: I0311 09:35:12.972609 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.051593 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-credential-keys\") pod \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.051643 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-config-data\") pod \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.051666 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-scripts\") pod \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.051726 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-fernet-keys\") pod \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\" (UID: 
\"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.051838 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-combined-ca-bundle\") pod \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.051952 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29c95\" (UniqueName: \"kubernetes.io/projected/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-kube-api-access-29c95\") pod \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\" (UID: \"fdcc3064-6041-40ab-b12e-6ca3f6bd6884\") " Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.069984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-kube-api-access-29c95" (OuterVolumeSpecName: "kube-api-access-29c95") pod "fdcc3064-6041-40ab-b12e-6ca3f6bd6884" (UID: "fdcc3064-6041-40ab-b12e-6ca3f6bd6884"). InnerVolumeSpecName "kube-api-access-29c95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.070009 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fdcc3064-6041-40ab-b12e-6ca3f6bd6884" (UID: "fdcc3064-6041-40ab-b12e-6ca3f6bd6884"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.080793 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fdcc3064-6041-40ab-b12e-6ca3f6bd6884" (UID: "fdcc3064-6041-40ab-b12e-6ca3f6bd6884"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.084895 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-scripts" (OuterVolumeSpecName: "scripts") pod "fdcc3064-6041-40ab-b12e-6ca3f6bd6884" (UID: "fdcc3064-6041-40ab-b12e-6ca3f6bd6884"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.086154 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-config-data" (OuterVolumeSpecName: "config-data") pod "fdcc3064-6041-40ab-b12e-6ca3f6bd6884" (UID: "fdcc3064-6041-40ab-b12e-6ca3f6bd6884"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.092680 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdcc3064-6041-40ab-b12e-6ca3f6bd6884" (UID: "fdcc3064-6041-40ab-b12e-6ca3f6bd6884"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.154274 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.154313 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.154325 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29c95\" (UniqueName: \"kubernetes.io/projected/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-kube-api-access-29c95\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.154334 4830 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.154343 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.154351 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdcc3064-6041-40ab-b12e-6ca3f6bd6884-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.616748 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p47hl" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.621581 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p47hl" event={"ID":"fdcc3064-6041-40ab-b12e-6ca3f6bd6884","Type":"ContainerDied","Data":"45c8e215cc6467b19e2ecfdba9d1fe3a19a176ac663aa857e1996bb9f61e13a2"} Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.621646 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45c8e215cc6467b19e2ecfdba9d1fe3a19a176ac663aa857e1996bb9f61e13a2" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.622496 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.671329 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-559977bfdc-r7ssx"] Mar 11 09:35:13 crc kubenswrapper[4830]: E0311 09:35:13.671977 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdcc3064-6041-40ab-b12e-6ca3f6bd6884" containerName="keystone-bootstrap" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.672006 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdcc3064-6041-40ab-b12e-6ca3f6bd6884" containerName="keystone-bootstrap" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.672323 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdcc3064-6041-40ab-b12e-6ca3f6bd6884" containerName="keystone-bootstrap" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.677389 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.686403 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.686541 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.686826 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q9l9c" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.686602 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.686650 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.686742 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.717493 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-559977bfdc-r7ssx"] Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.775466 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-credential-keys\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.775548 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-combined-ca-bundle\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " 
pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.775600 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-config-data\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.775633 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-fernet-keys\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.775693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9g2j\" (UniqueName: \"kubernetes.io/projected/3d8403ac-71e1-41f2-a897-bf61055308f6-kube-api-access-z9g2j\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.775763 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-scripts\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.775797 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-public-tls-certs\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " 
pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.775825 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-internal-tls-certs\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.877544 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9g2j\" (UniqueName: \"kubernetes.io/projected/3d8403ac-71e1-41f2-a897-bf61055308f6-kube-api-access-z9g2j\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.877643 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-scripts\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.877686 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-public-tls-certs\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.877720 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-internal-tls-certs\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 
crc kubenswrapper[4830]: I0311 09:35:13.877776 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-credential-keys\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.877803 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-combined-ca-bundle\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.877834 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-config-data\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.877861 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-fernet-keys\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.889655 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-scripts\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.889947 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-combined-ca-bundle\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.890547 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-internal-tls-certs\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.891472 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-credential-keys\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.895889 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-fernet-keys\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.900130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9g2j\" (UniqueName: \"kubernetes.io/projected/3d8403ac-71e1-41f2-a897-bf61055308f6-kube-api-access-z9g2j\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.902699 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-public-tls-certs\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:13 crc kubenswrapper[4830]: I0311 09:35:13.916234 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8403ac-71e1-41f2-a897-bf61055308f6-config-data\") pod \"keystone-559977bfdc-r7ssx\" (UID: \"3d8403ac-71e1-41f2-a897-bf61055308f6\") " pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:14 crc kubenswrapper[4830]: I0311 09:35:14.032453 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:14 crc kubenswrapper[4830]: I0311 09:35:14.627859 4830 generic.go:334] "Generic (PLEG): container finished" podID="253897f0-4649-46c8-9bb3-9d25a4864701" containerID="1cf30dcfc616d5220609e5b56d96126ad66cb4802eee570f8d75ef7e0185826e" exitCode=0 Mar 11 09:35:14 crc kubenswrapper[4830]: I0311 09:35:14.628048 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fwspb" event={"ID":"253897f0-4649-46c8-9bb3-9d25a4864701","Type":"ContainerDied","Data":"1cf30dcfc616d5220609e5b56d96126ad66cb4802eee570f8d75ef7e0185826e"} Mar 11 09:35:15 crc kubenswrapper[4830]: I0311 09:35:15.901435 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:15 crc kubenswrapper[4830]: I0311 09:35:15.901851 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:15 crc kubenswrapper[4830]: I0311 09:35:15.943989 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:15 crc kubenswrapper[4830]: I0311 09:35:15.967111 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:16 crc kubenswrapper[4830]: I0311 09:35:16.650712 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:16 crc kubenswrapper[4830]: I0311 09:35:16.650757 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.133237 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2wcrf" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.168057 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fwspb" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.245679 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-combined-ca-bundle\") pod \"253897f0-4649-46c8-9bb3-9d25a4864701\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.245774 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg2wk\" (UniqueName: \"kubernetes.io/projected/9713cf71-536f-4674-8184-7c7651dad952-kube-api-access-rg2wk\") pod \"9713cf71-536f-4674-8184-7c7651dad952\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.245816 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-scripts\") pod \"9713cf71-536f-4674-8184-7c7651dad952\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.245861 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-config\") pod \"253897f0-4649-46c8-9bb3-9d25a4864701\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.245937 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9713cf71-536f-4674-8184-7c7651dad952-logs\") pod \"9713cf71-536f-4674-8184-7c7651dad952\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.246049 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncrfb\" (UniqueName: \"kubernetes.io/projected/253897f0-4649-46c8-9bb3-9d25a4864701-kube-api-access-ncrfb\") pod \"253897f0-4649-46c8-9bb3-9d25a4864701\" (UID: \"253897f0-4649-46c8-9bb3-9d25a4864701\") " Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.246086 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-combined-ca-bundle\") pod \"9713cf71-536f-4674-8184-7c7651dad952\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.246178 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-config-data\") pod \"9713cf71-536f-4674-8184-7c7651dad952\" (UID: \"9713cf71-536f-4674-8184-7c7651dad952\") " Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.253899 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9713cf71-536f-4674-8184-7c7651dad952-kube-api-access-rg2wk" (OuterVolumeSpecName: "kube-api-access-rg2wk") pod "9713cf71-536f-4674-8184-7c7651dad952" (UID: "9713cf71-536f-4674-8184-7c7651dad952"). 
InnerVolumeSpecName "kube-api-access-rg2wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.254172 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9713cf71-536f-4674-8184-7c7651dad952-logs" (OuterVolumeSpecName: "logs") pod "9713cf71-536f-4674-8184-7c7651dad952" (UID: "9713cf71-536f-4674-8184-7c7651dad952"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.254332 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253897f0-4649-46c8-9bb3-9d25a4864701-kube-api-access-ncrfb" (OuterVolumeSpecName: "kube-api-access-ncrfb") pod "253897f0-4649-46c8-9bb3-9d25a4864701" (UID: "253897f0-4649-46c8-9bb3-9d25a4864701"). InnerVolumeSpecName "kube-api-access-ncrfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.272150 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-scripts" (OuterVolumeSpecName: "scripts") pod "9713cf71-536f-4674-8184-7c7651dad952" (UID: "9713cf71-536f-4674-8184-7c7651dad952"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.296934 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-config-data" (OuterVolumeSpecName: "config-data") pod "9713cf71-536f-4674-8184-7c7651dad952" (UID: "9713cf71-536f-4674-8184-7c7651dad952"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.321552 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-config" (OuterVolumeSpecName: "config") pod "253897f0-4649-46c8-9bb3-9d25a4864701" (UID: "253897f0-4649-46c8-9bb3-9d25a4864701"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.323194 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "253897f0-4649-46c8-9bb3-9d25a4864701" (UID: "253897f0-4649-46c8-9bb3-9d25a4864701"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.330105 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9713cf71-536f-4674-8184-7c7651dad952" (UID: "9713cf71-536f-4674-8184-7c7651dad952"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.347762 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.347800 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.347817 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg2wk\" (UniqueName: \"kubernetes.io/projected/9713cf71-536f-4674-8184-7c7651dad952-kube-api-access-rg2wk\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.347826 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.347837 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/253897f0-4649-46c8-9bb3-9d25a4864701-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.347845 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9713cf71-536f-4674-8184-7c7651dad952-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.347853 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncrfb\" (UniqueName: \"kubernetes.io/projected/253897f0-4649-46c8-9bb3-9d25a4864701-kube-api-access-ncrfb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.347861 4830 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9713cf71-536f-4674-8184-7c7651dad952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.513911 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-559977bfdc-r7ssx"] Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.674974 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fwspb" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.676258 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fwspb" event={"ID":"253897f0-4649-46c8-9bb3-9d25a4864701","Type":"ContainerDied","Data":"0a300820ab826816cc64c23d1843313baca9f09ff15f8786c31435e87f203207"} Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.676516 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a300820ab826816cc64c23d1843313baca9f09ff15f8786c31435e87f203207" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.680187 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2wcrf" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.680758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2wcrf" event={"ID":"9713cf71-536f-4674-8184-7c7651dad952","Type":"ContainerDied","Data":"3f370bf2dbe7bc49ab032f6808efdbe375ecc503d49c6ced440eeff0a08554d2"} Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.680863 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f370bf2dbe7bc49ab032f6808efdbe375ecc503d49c6ced440eeff0a08554d2" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.852798 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.852855 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.893349 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 09:35:17 crc kubenswrapper[4830]: I0311 09:35:17.903239 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.424829 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-599b4448-86g7s"] Mar 11 09:35:18 crc kubenswrapper[4830]: E0311 09:35:18.425534 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253897f0-4649-46c8-9bb3-9d25a4864701" containerName="neutron-db-sync" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.425547 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="253897f0-4649-46c8-9bb3-9d25a4864701" containerName="neutron-db-sync" Mar 11 09:35:18 crc kubenswrapper[4830]: E0311 09:35:18.425578 4830 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9713cf71-536f-4674-8184-7c7651dad952" containerName="placement-db-sync" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.425601 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9713cf71-536f-4674-8184-7c7651dad952" containerName="placement-db-sync" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.425818 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9713cf71-536f-4674-8184-7c7651dad952" containerName="placement-db-sync" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.425857 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="253897f0-4649-46c8-9bb3-9d25a4864701" containerName="neutron-db-sync" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.426823 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.432754 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.433047 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qjp5l" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.433253 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.433946 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.439933 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.447900 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vdn9k"] Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.448148 4830 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" podUID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerName="dnsmasq-dns" containerID="cri-o://11f5b495c9f37f75b9c337537150813638b88922858fcfdec58d43ae4efd47ee" gracePeriod=10 Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.455992 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.467234 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-scripts\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.467279 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-logs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.467328 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-public-tls-certs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.467360 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-internal-tls-certs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " 
pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.467411 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-config-data\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.467433 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-combined-ca-bundle\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.467472 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjd67\" (UniqueName: \"kubernetes.io/projected/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-kube-api-access-qjd67\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.467608 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-599b4448-86g7s"] Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.555498 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-ntjpw"] Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.557371 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.574415 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-scripts\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.574469 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-logs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.574517 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-public-tls-certs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.574549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-internal-tls-certs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.574592 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-config-data\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.574613 
4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-combined-ca-bundle\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.574652 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjd67\" (UniqueName: \"kubernetes.io/projected/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-kube-api-access-qjd67\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.578873 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-logs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.598118 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjd67\" (UniqueName: \"kubernetes.io/projected/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-kube-api-access-qjd67\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.598222 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-scripts\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.600576 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-internal-tls-certs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.604200 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-config-data\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.604283 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-combined-ca-bundle\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.605610 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da8c023e-1cf1-4a06-8c20-2b79612f7ae8-public-tls-certs\") pod \"placement-599b4448-86g7s\" (UID: \"da8c023e-1cf1-4a06-8c20-2b79612f7ae8\") " pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.629086 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-ntjpw"] Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.655720 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-586c785596-k4qp7"] Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.657219 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.660634 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.660822 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sxs8x" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.660962 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.661160 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.669033 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-586c785596-k4qp7"] Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675620 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675658 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26xqv\" (UniqueName: \"kubernetes.io/projected/af048295-8bc1-42cb-8f67-3049b2dc4215-kube-api-access-26xqv\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675712 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-svc\") pod 
\"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675734 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-config\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675763 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-config\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675808 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-combined-ca-bundle\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675832 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-ovndb-tls-certs\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675848 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7rk\" (UniqueName: \"kubernetes.io/projected/addc50b8-ce68-4690-9979-07b2f596215d-kube-api-access-fb7rk\") pod 
\"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675863 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675897 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-httpd-config\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.675926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.727109 4830 generic.go:334] "Generic (PLEG): container finished" podID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerID="11f5b495c9f37f75b9c337537150813638b88922858fcfdec58d43ae4efd47ee" exitCode=0 Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.728616 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" event={"ID":"99501380-81f7-4f53-8b8e-ca9e4cd51567","Type":"ContainerDied","Data":"11f5b495c9f37f75b9c337537150813638b88922858fcfdec58d43ae4efd47ee"} Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.728663 4830 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.728826 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.749482 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.777954 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-combined-ca-bundle\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-ovndb-tls-certs\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778053 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7rk\" (UniqueName: \"kubernetes.io/projected/addc50b8-ce68-4690-9979-07b2f596215d-kube-api-access-fb7rk\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778072 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " 
pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778142 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-httpd-config\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778232 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778337 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26xqv\" (UniqueName: \"kubernetes.io/projected/af048295-8bc1-42cb-8f67-3049b2dc4215-kube-api-access-26xqv\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778419 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc 
kubenswrapper[4830]: I0311 09:35:18.778458 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-config\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.778507 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-config\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.780517 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-config\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.780542 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.781186 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.782004 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.784838 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-combined-ca-bundle\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.788871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.789839 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-httpd-config\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.793887 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-ovndb-tls-certs\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.804987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7rk\" (UniqueName: \"kubernetes.io/projected/addc50b8-ce68-4690-9979-07b2f596215d-kube-api-access-fb7rk\") pod 
\"dnsmasq-dns-5ccc5c4795-ntjpw\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.808356 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26xqv\" (UniqueName: \"kubernetes.io/projected/af048295-8bc1-42cb-8f67-3049b2dc4215-kube-api-access-26xqv\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:18 crc kubenswrapper[4830]: I0311 09:35:18.814198 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-config\") pod \"neutron-586c785596-k4qp7\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:19 crc kubenswrapper[4830]: I0311 09:35:19.010440 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:19 crc kubenswrapper[4830]: I0311 09:35:19.020227 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:19 crc kubenswrapper[4830]: I0311 09:35:19.175230 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" podUID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused" Mar 11 09:35:19 crc kubenswrapper[4830]: I0311 09:35:19.506557 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:19 crc kubenswrapper[4830]: I0311 09:35:19.506720 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 09:35:19 crc kubenswrapper[4830]: I0311 09:35:19.561842 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 09:35:20 crc kubenswrapper[4830]: I0311 09:35:20.751119 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 09:35:20 crc kubenswrapper[4830]: I0311 09:35:20.751379 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 09:35:20 crc kubenswrapper[4830]: W0311 09:35:20.845875 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d8403ac_71e1_41f2_a897_bf61055308f6.slice/crio-19ad81e0b3010619228c3e09f7b3b22181296062075a564873d0844658d755e5 WatchSource:0}: Error finding container 19ad81e0b3010619228c3e09f7b3b22181296062075a564873d0844658d755e5: Status 404 returned error can't find the container with id 19ad81e0b3010619228c3e09f7b3b22181296062075a564873d0844658d755e5 Mar 11 09:35:20 crc kubenswrapper[4830]: I0311 09:35:20.900579 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54c74bff69-478cc"] Mar 11 09:35:20 crc kubenswrapper[4830]: I0311 09:35:20.903182 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:20 crc kubenswrapper[4830]: I0311 09:35:20.907853 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 11 09:35:20 crc kubenswrapper[4830]: I0311 09:35:20.910926 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 11 09:35:20 crc kubenswrapper[4830]: I0311 09:35:20.914039 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54c74bff69-478cc"] Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.038570 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-combined-ca-bundle\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.038674 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-ovndb-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.038701 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-httpd-config\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.038720 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-config\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.038740 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-public-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.038757 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-internal-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.038825 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqch5\" (UniqueName: \"kubernetes.io/projected/55875815-4467-4c5e-8401-b220cb1694c6-kube-api-access-fqch5\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.153393 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-combined-ca-bundle\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.153502 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-httpd-config\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.153525 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-ovndb-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.153555 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-config\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.153594 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-public-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.153612 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-internal-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.153702 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqch5\" (UniqueName: \"kubernetes.io/projected/55875815-4467-4c5e-8401-b220cb1694c6-kube-api-access-fqch5\") pod 
\"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.164595 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-ovndb-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.175787 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-combined-ca-bundle\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.175897 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-public-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.177805 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-internal-tls-certs\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.178323 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-config\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc 
kubenswrapper[4830]: I0311 09:35:21.181452 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/55875815-4467-4c5e-8401-b220cb1694c6-httpd-config\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.185939 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqch5\" (UniqueName: \"kubernetes.io/projected/55875815-4467-4c5e-8401-b220cb1694c6-kube-api-access-fqch5\") pod \"neutron-54c74bff69-478cc\" (UID: \"55875815-4467-4c5e-8401-b220cb1694c6\") " pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.386252 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.402695 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.491302 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-sb\") pod \"99501380-81f7-4f53-8b8e-ca9e4cd51567\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.491638 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-config\") pod \"99501380-81f7-4f53-8b8e-ca9e4cd51567\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.491677 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-nb\") pod \"99501380-81f7-4f53-8b8e-ca9e4cd51567\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.494213 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-svc\") pod \"99501380-81f7-4f53-8b8e-ca9e4cd51567\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.494585 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-swift-storage-0\") pod \"99501380-81f7-4f53-8b8e-ca9e4cd51567\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.494661 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpgj8\" 
(UniqueName: \"kubernetes.io/projected/99501380-81f7-4f53-8b8e-ca9e4cd51567-kube-api-access-fpgj8\") pod \"99501380-81f7-4f53-8b8e-ca9e4cd51567\" (UID: \"99501380-81f7-4f53-8b8e-ca9e4cd51567\") " Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.507457 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99501380-81f7-4f53-8b8e-ca9e4cd51567-kube-api-access-fpgj8" (OuterVolumeSpecName: "kube-api-access-fpgj8") pod "99501380-81f7-4f53-8b8e-ca9e4cd51567" (UID: "99501380-81f7-4f53-8b8e-ca9e4cd51567"). InnerVolumeSpecName "kube-api-access-fpgj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.594939 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99501380-81f7-4f53-8b8e-ca9e4cd51567" (UID: "99501380-81f7-4f53-8b8e-ca9e4cd51567"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.599451 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpgj8\" (UniqueName: \"kubernetes.io/projected/99501380-81f7-4f53-8b8e-ca9e4cd51567-kube-api-access-fpgj8\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.599482 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.633834 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-config" (OuterVolumeSpecName: "config") pod "99501380-81f7-4f53-8b8e-ca9e4cd51567" (UID: "99501380-81f7-4f53-8b8e-ca9e4cd51567"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.686415 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99501380-81f7-4f53-8b8e-ca9e4cd51567" (UID: "99501380-81f7-4f53-8b8e-ca9e4cd51567"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.688750 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99501380-81f7-4f53-8b8e-ca9e4cd51567" (UID: "99501380-81f7-4f53-8b8e-ca9e4cd51567"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.705396 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.705890 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.705903 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.730424 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-svc" 
(OuterVolumeSpecName: "dns-svc") pod "99501380-81f7-4f53-8b8e-ca9e4cd51567" (UID: "99501380-81f7-4f53-8b8e-ca9e4cd51567"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.797502 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.806098 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-ntjpw"] Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.808724 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99501380-81f7-4f53-8b8e-ca9e4cd51567-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.815929 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.822336 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-559977bfdc-r7ssx" event={"ID":"3d8403ac-71e1-41f2-a897-bf61055308f6","Type":"ContainerStarted","Data":"809f8316617faa259b98605532460e8d7823dfb09dd06e7485d579e61c08f47b"} Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.823912 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-559977bfdc-r7ssx" event={"ID":"3d8403ac-71e1-41f2-a897-bf61055308f6","Type":"ContainerStarted","Data":"19ad81e0b3010619228c3e09f7b3b22181296062075a564873d0844658d755e5"} Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.823935 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.836934 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bc3adf05-3cb9-4fda-be48-67b6b3084179","Type":"ContainerStarted","Data":"b63752527d4b2ca0b545c57d7b5c52f9778338607a2d1c8e1c1834b7d5c5b827"} Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.860593 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" event={"ID":"99501380-81f7-4f53-8b8e-ca9e4cd51567","Type":"ContainerDied","Data":"8dfaa19546eea2435461d4eb07cdaa5913c4b48426f15011bebf701f3d8770e0"} Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.860655 4830 scope.go:117] "RemoveContainer" containerID="11f5b495c9f37f75b9c337537150813638b88922858fcfdec58d43ae4efd47ee" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.860859 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-vdn9k" Mar 11 09:35:21 crc kubenswrapper[4830]: I0311 09:35:21.942149 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-559977bfdc-r7ssx" podStartSLOduration=8.942124778 podStartE2EDuration="8.942124778s" podCreationTimestamp="2026-03-11 09:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:21.894432225 +0000 UTC m=+1289.675582924" watchObservedRunningTime="2026-03-11 09:35:21.942124778 +0000 UTC m=+1289.723275467" Mar 11 09:35:22 crc kubenswrapper[4830]: W0311 09:35:22.002742 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf048295_8bc1_42cb_8f67_3049b2dc4215.slice/crio-f44aeabdbcdbebd6ed7ccb05a04910e8be6e86a94d8b632834a18e5063e2bb4c WatchSource:0}: Error finding container f44aeabdbcdbebd6ed7ccb05a04910e8be6e86a94d8b632834a18e5063e2bb4c: Status 404 returned error can't find the container with id f44aeabdbcdbebd6ed7ccb05a04910e8be6e86a94d8b632834a18e5063e2bb4c Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 
09:35:22.020443 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-586c785596-k4qp7"] Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.041460 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5f6b87df74-q5t2v" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.080150 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-599b4448-86g7s"] Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.175344 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vdn9k"] Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.208443 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-vdn9k"] Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.248130 4830 scope.go:117] "RemoveContainer" containerID="7138e4ba50c1b33f9fb2b7165fbf23990b1688348cc88054a79fd487a85df55a" Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.367591 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-789dc4b6cd-xz7ds" podUID="77e86c78-b565-4e6c-8867-519fa2d5137a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.373550 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54c74bff69-478cc"] Mar 11 09:35:22 crc kubenswrapper[4830]: E0311 09:35:22.477261 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99501380_81f7_4f53_8b8e_ca9e4cd51567.slice\": RecentStats: unable to find data in memory cache]" Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.898960 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586c785596-k4qp7" event={"ID":"af048295-8bc1-42cb-8f67-3049b2dc4215","Type":"ContainerStarted","Data":"116297744a0bf959e008764ba107ace74be7cbcf5efe31f187e28c2080a3ec4c"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.899561 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586c785596-k4qp7" event={"ID":"af048295-8bc1-42cb-8f67-3049b2dc4215","Type":"ContainerStarted","Data":"f44aeabdbcdbebd6ed7ccb05a04910e8be6e86a94d8b632834a18e5063e2bb4c"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.919600 4830 generic.go:334] "Generic (PLEG): container finished" podID="addc50b8-ce68-4690-9979-07b2f596215d" containerID="c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b" exitCode=0 Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.919667 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" event={"ID":"addc50b8-ce68-4690-9979-07b2f596215d","Type":"ContainerDied","Data":"c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.919690 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" event={"ID":"addc50b8-ce68-4690-9979-07b2f596215d","Type":"ContainerStarted","Data":"fa262b4a620376170430edd07ffbdfc8b7be2c0bda120acef00ae4ed93ba4500"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.979666 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99501380-81f7-4f53-8b8e-ca9e4cd51567" path="/var/lib/kubelet/pods/99501380-81f7-4f53-8b8e-ca9e4cd51567/volumes" Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.980603 4830 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pl7d8" event={"ID":"045501ed-58bb-4a38-9b4a-5091217cf610","Type":"ContainerStarted","Data":"4b8a3a83ef1b2c06935e0d9d5e696aca65f4f0b637c4f517ce8592bd55f0f7fd"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.980651 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54c74bff69-478cc" event={"ID":"55875815-4467-4c5e-8401-b220cb1694c6","Type":"ContainerStarted","Data":"2c16ba24a2dbcab4ec9b6d5f8082e514fe2bf9cbde53970c8c57cd58cc82b004"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.980669 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54c74bff69-478cc" event={"ID":"55875815-4467-4c5e-8401-b220cb1694c6","Type":"ContainerStarted","Data":"6e7efd075b8caa213213254ce772e9cdc3a03053630e818d55d18d638e0fa013"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.980682 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lqwp9" event={"ID":"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab","Type":"ContainerStarted","Data":"66d7bb6bd665ede20693578acfb774acb7111749bd1bcf20640facf724763981"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.980697 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599b4448-86g7s" event={"ID":"da8c023e-1cf1-4a06-8c20-2b79612f7ae8","Type":"ContainerStarted","Data":"1d5221b13018a2c6a52f0ebbedb47f55d7e2b88491943fa9a31f5825b73e9573"} Mar 11 09:35:22 crc kubenswrapper[4830]: I0311 09:35:22.980709 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599b4448-86g7s" event={"ID":"da8c023e-1cf1-4a06-8c20-2b79612f7ae8","Type":"ContainerStarted","Data":"93528af15bb3b2e63bd029889ced0f43cdf3370eee843bb3a933931f1279c019"} Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.116092 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lqwp9" podStartSLOduration=3.443591897 
podStartE2EDuration="50.116059888s" podCreationTimestamp="2026-03-11 09:34:33 +0000 UTC" firstStartedPulling="2026-03-11 09:34:34.694402631 +0000 UTC m=+1242.475553320" lastFinishedPulling="2026-03-11 09:35:21.366870622 +0000 UTC m=+1289.148021311" observedRunningTime="2026-03-11 09:35:23.103570772 +0000 UTC m=+1290.884721461" watchObservedRunningTime="2026-03-11 09:35:23.116059888 +0000 UTC m=+1290.897210587" Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.147814 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-599b4448-86g7s" podStartSLOduration=5.147775498 podStartE2EDuration="5.147775498s" podCreationTimestamp="2026-03-11 09:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:23.138532991 +0000 UTC m=+1290.919683680" watchObservedRunningTime="2026-03-11 09:35:23.147775498 +0000 UTC m=+1290.928926187" Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.976054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-599b4448-86g7s" event={"ID":"da8c023e-1cf1-4a06-8c20-2b79612f7ae8","Type":"ContainerStarted","Data":"8e9b8282e135bd38a201c3bcb60a686e14437834b5774f64736c575ecdcd089d"} Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.976393 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.976412 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.978498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586c785596-k4qp7" event={"ID":"af048295-8bc1-42cb-8f67-3049b2dc4215","Type":"ContainerStarted","Data":"6c94df8ed2b30d7dd1549194d3511d02e42eb5b3aa1da0ebf8f3399a63de4d72"} Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 
09:35:23.978634 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.982754 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" event={"ID":"addc50b8-ce68-4690-9979-07b2f596215d","Type":"ContainerStarted","Data":"5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524"} Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.982937 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.986075 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54c74bff69-478cc" event={"ID":"55875815-4467-4c5e-8401-b220cb1694c6","Type":"ContainerStarted","Data":"fc07bbc133e735d71aa37892300ca27cfccf11a9c0d0cfa00b260e58ebcac0fe"} Mar 11 09:35:23 crc kubenswrapper[4830]: I0311 09:35:23.986394 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:24 crc kubenswrapper[4830]: I0311 09:35:24.006678 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-pl7d8" podStartSLOduration=4.285278632 podStartE2EDuration="51.00665261s" podCreationTimestamp="2026-03-11 09:34:33 +0000 UTC" firstStartedPulling="2026-03-11 09:34:34.743143343 +0000 UTC m=+1242.524294032" lastFinishedPulling="2026-03-11 09:35:21.464517311 +0000 UTC m=+1289.245668010" observedRunningTime="2026-03-11 09:35:23.161465358 +0000 UTC m=+1290.942616057" watchObservedRunningTime="2026-03-11 09:35:24.00665261 +0000 UTC m=+1291.787803299" Mar 11 09:35:24 crc kubenswrapper[4830]: I0311 09:35:24.038278 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-586c785596-k4qp7" podStartSLOduration=6.038244307 podStartE2EDuration="6.038244307s" podCreationTimestamp="2026-03-11 
09:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:24.011466404 +0000 UTC m=+1291.792617103" watchObservedRunningTime="2026-03-11 09:35:24.038244307 +0000 UTC m=+1291.819394996" Mar 11 09:35:24 crc kubenswrapper[4830]: I0311 09:35:24.041960 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54c74bff69-478cc" podStartSLOduration=4.041930149 podStartE2EDuration="4.041930149s" podCreationTimestamp="2026-03-11 09:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:24.033597528 +0000 UTC m=+1291.814748217" watchObservedRunningTime="2026-03-11 09:35:24.041930149 +0000 UTC m=+1291.823080838" Mar 11 09:35:24 crc kubenswrapper[4830]: I0311 09:35:24.062438 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" podStartSLOduration=6.062403927 podStartE2EDuration="6.062403927s" podCreationTimestamp="2026-03-11 09:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:24.053451339 +0000 UTC m=+1291.834602038" watchObservedRunningTime="2026-03-11 09:35:24.062403927 +0000 UTC m=+1291.843554626" Mar 11 09:35:26 crc kubenswrapper[4830]: I0311 09:35:26.005610 4830 generic.go:334] "Generic (PLEG): container finished" podID="045501ed-58bb-4a38-9b4a-5091217cf610" containerID="4b8a3a83ef1b2c06935e0d9d5e696aca65f4f0b637c4f517ce8592bd55f0f7fd" exitCode=0 Mar 11 09:35:26 crc kubenswrapper[4830]: I0311 09:35:26.005666 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pl7d8" event={"ID":"045501ed-58bb-4a38-9b4a-5091217cf610","Type":"ContainerDied","Data":"4b8a3a83ef1b2c06935e0d9d5e696aca65f4f0b637c4f517ce8592bd55f0f7fd"} Mar 11 
09:35:28 crc kubenswrapper[4830]: I0311 09:35:28.028402 4830 generic.go:334] "Generic (PLEG): container finished" podID="fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" containerID="66d7bb6bd665ede20693578acfb774acb7111749bd1bcf20640facf724763981" exitCode=0 Mar 11 09:35:28 crc kubenswrapper[4830]: I0311 09:35:28.028493 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lqwp9" event={"ID":"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab","Type":"ContainerDied","Data":"66d7bb6bd665ede20693578acfb774acb7111749bd1bcf20640facf724763981"} Mar 11 09:35:29 crc kubenswrapper[4830]: I0311 09:35:29.011163 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:29 crc kubenswrapper[4830]: I0311 09:35:29.088294 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-4m7pp"] Mar 11 09:35:29 crc kubenswrapper[4830]: I0311 09:35:29.088557 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" podUID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerName="dnsmasq-dns" containerID="cri-o://9090e448f5755db665471c0ce3878c3de13bad11806387d0acdad4a261dcab75" gracePeriod=10 Mar 11 09:35:29 crc kubenswrapper[4830]: I0311 09:35:29.201375 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" podUID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Mar 11 09:35:29 crc kubenswrapper[4830]: I0311 09:35:29.855843 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.023786 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hl6t\" (UniqueName: \"kubernetes.io/projected/045501ed-58bb-4a38-9b4a-5091217cf610-kube-api-access-2hl6t\") pod \"045501ed-58bb-4a38-9b4a-5091217cf610\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.024078 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-combined-ca-bundle\") pod \"045501ed-58bb-4a38-9b4a-5091217cf610\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.024236 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-db-sync-config-data\") pod \"045501ed-58bb-4a38-9b4a-5091217cf610\" (UID: \"045501ed-58bb-4a38-9b4a-5091217cf610\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.031546 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "045501ed-58bb-4a38-9b4a-5091217cf610" (UID: "045501ed-58bb-4a38-9b4a-5091217cf610"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.034166 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/045501ed-58bb-4a38-9b4a-5091217cf610-kube-api-access-2hl6t" (OuterVolumeSpecName: "kube-api-access-2hl6t") pod "045501ed-58bb-4a38-9b4a-5091217cf610" (UID: "045501ed-58bb-4a38-9b4a-5091217cf610"). 
InnerVolumeSpecName "kube-api-access-2hl6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.058895 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "045501ed-58bb-4a38-9b4a-5091217cf610" (UID: "045501ed-58bb-4a38-9b4a-5091217cf610"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.081089 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pl7d8" event={"ID":"045501ed-58bb-4a38-9b4a-5091217cf610","Type":"ContainerDied","Data":"8767287839fccd93826ffb117034d90549292ebbb319c1bc3e130e845d9f1f1a"} Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.081138 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8767287839fccd93826ffb117034d90549292ebbb319c1bc3e130e845d9f1f1a" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.081267 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pl7d8" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.092928 4830 generic.go:334] "Generic (PLEG): container finished" podID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerID="9090e448f5755db665471c0ce3878c3de13bad11806387d0acdad4a261dcab75" exitCode=0 Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.092978 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" event={"ID":"a4a68d64-f644-4e4a-a216-af618d1883c8","Type":"ContainerDied","Data":"9090e448f5755db665471c0ce3878c3de13bad11806387d0acdad4a261dcab75"} Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.129515 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hl6t\" (UniqueName: \"kubernetes.io/projected/045501ed-58bb-4a38-9b4a-5091217cf610-kube-api-access-2hl6t\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.129542 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.129552 4830 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/045501ed-58bb-4a38-9b4a-5091217cf610-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.612067 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.620058 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740565 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-combined-ca-bundle\") pod \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740631 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-sb\") pod \"a4a68d64-f644-4e4a-a216-af618d1883c8\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740663 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-etc-machine-id\") pod \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740740 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-dns-svc\") pod \"a4a68d64-f644-4e4a-a216-af618d1883c8\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740777 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bvd7\" (UniqueName: \"kubernetes.io/projected/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-kube-api-access-5bvd7\") pod \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740806 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-config-data\") pod \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740841 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-scripts\") pod \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740855 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-db-sync-config-data\") pod \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\" (UID: \"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740899 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-config\") pod \"a4a68d64-f644-4e4a-a216-af618d1883c8\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740915 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs7mx\" (UniqueName: \"kubernetes.io/projected/a4a68d64-f644-4e4a-a216-af618d1883c8-kube-api-access-xs7mx\") pod \"a4a68d64-f644-4e4a-a216-af618d1883c8\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.740939 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-nb\") pod \"a4a68d64-f644-4e4a-a216-af618d1883c8\" (UID: \"a4a68d64-f644-4e4a-a216-af618d1883c8\") " Mar 11 09:35:30 crc kubenswrapper[4830]: 
I0311 09:35:30.741911 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" (UID: "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.745779 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-scripts" (OuterVolumeSpecName: "scripts") pod "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" (UID: "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.756046 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" (UID: "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.760118 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a68d64-f644-4e4a-a216-af618d1883c8-kube-api-access-xs7mx" (OuterVolumeSpecName: "kube-api-access-xs7mx") pod "a4a68d64-f644-4e4a-a216-af618d1883c8" (UID: "a4a68d64-f644-4e4a-a216-af618d1883c8"). InnerVolumeSpecName "kube-api-access-xs7mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.781318 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-kube-api-access-5bvd7" (OuterVolumeSpecName: "kube-api-access-5bvd7") pod "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" (UID: "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab"). InnerVolumeSpecName "kube-api-access-5bvd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.784659 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" (UID: "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.798381 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-config-data" (OuterVolumeSpecName: "config-data") pod "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" (UID: "fffcbc2c-4845-4a9d-8709-45eb4a28f0ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.819034 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4a68d64-f644-4e4a-a216-af618d1883c8" (UID: "a4a68d64-f644-4e4a-a216-af618d1883c8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.819161 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-config" (OuterVolumeSpecName: "config") pod "a4a68d64-f644-4e4a-a216-af618d1883c8" (UID: "a4a68d64-f644-4e4a-a216-af618d1883c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.832502 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4a68d64-f644-4e4a-a216-af618d1883c8" (UID: "a4a68d64-f644-4e4a-a216-af618d1883c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.833011 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4a68d64-f644-4e4a-a216-af618d1883c8" (UID: "a4a68d64-f644-4e4a-a216-af618d1883c8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843119 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843323 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843428 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843543 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843626 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bvd7\" (UniqueName: \"kubernetes.io/projected/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-kube-api-access-5bvd7\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843710 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843782 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843845 4830 
reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.843910 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.844041 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs7mx\" (UniqueName: \"kubernetes.io/projected/a4a68d64-f644-4e4a-a216-af618d1883c8-kube-api-access-xs7mx\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:30 crc kubenswrapper[4830]: I0311 09:35:30.844113 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4a68d64-f644-4e4a-a216-af618d1883c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.166402 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" event={"ID":"a4a68d64-f644-4e4a-a216-af618d1883c8","Type":"ContainerDied","Data":"0ccdde2761a90b9ccc3eb226cf50fc64f4f3a2feb6aca61ecf1ce54e12f66fd9"} Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.166777 4830 scope.go:117] "RemoveContainer" containerID="9090e448f5755db665471c0ce3878c3de13bad11806387d0acdad4a261dcab75" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.166926 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56798b757f-4m7pp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181221 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c976ddb9d-ppssd"] Mar 11 09:35:31 crc kubenswrapper[4830]: E0311 09:35:31.181571 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerName="init" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181583 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerName="init" Mar 11 09:35:31 crc kubenswrapper[4830]: E0311 09:35:31.181599 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerName="dnsmasq-dns" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181606 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerName="dnsmasq-dns" Mar 11 09:35:31 crc kubenswrapper[4830]: E0311 09:35:31.181624 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerName="dnsmasq-dns" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181630 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerName="dnsmasq-dns" Mar 11 09:35:31 crc kubenswrapper[4830]: E0311 09:35:31.181653 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="045501ed-58bb-4a38-9b4a-5091217cf610" containerName="barbican-db-sync" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181660 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="045501ed-58bb-4a38-9b4a-5091217cf610" containerName="barbican-db-sync" Mar 11 09:35:31 crc kubenswrapper[4830]: E0311 09:35:31.181680 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" 
containerName="cinder-db-sync" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181688 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" containerName="cinder-db-sync" Mar 11 09:35:31 crc kubenswrapper[4830]: E0311 09:35:31.181695 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerName="init" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181702 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerName="init" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181851 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a68d64-f644-4e4a-a216-af618d1883c8" containerName="dnsmasq-dns" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181866 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="045501ed-58bb-4a38-9b4a-5091217cf610" containerName="barbican-db-sync" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181875 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" containerName="cinder-db-sync" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.181897 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="99501380-81f7-4f53-8b8e-ca9e4cd51567" containerName="dnsmasq-dns" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.182808 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.188384 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lqwp9" event={"ID":"fffcbc2c-4845-4a9d-8709-45eb4a28f0ab","Type":"ContainerDied","Data":"d71895e5b9d5280b4ebb7c4ffd9eb57c474d362943489245698be9c2012895f5"} Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.188435 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d71895e5b9d5280b4ebb7c4ffd9eb57c474d362943489245698be9c2012895f5" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.188518 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lqwp9" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.190126 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.190350 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.190415 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-lwgss" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.190743 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6bddfb9bc9-6hzsp"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.192816 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.206720 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.212496 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6bddfb9bc9-6hzsp"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.224204 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c976ddb9d-ppssd"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.256190 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-4m7pp"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.287896 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56798b757f-4m7pp"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.370456 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kxfpv"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.372031 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373484 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-config-data-custom\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373563 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8wxh\" (UniqueName: \"kubernetes.io/projected/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-kube-api-access-n8wxh\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373592 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcnmq\" (UniqueName: \"kubernetes.io/projected/98ddc718-e67e-406f-aae3-03680232691b-kube-api-access-tcnmq\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373647 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-config-data\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373706 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-config-data-custom\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373732 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-logs\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373812 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-combined-ca-bundle\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373860 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98ddc718-e67e-406f-aae3-03680232691b-logs\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-combined-ca-bundle\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.373917 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-config-data\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.433109 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kxfpv"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495332 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-config-data\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495377 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-config-data-custom\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495408 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495443 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz8bh\" (UniqueName: 
\"kubernetes.io/projected/2c4727f8-8456-4d35-9850-f3187aa4e9b4-kube-api-access-rz8bh\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495460 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495482 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8wxh\" (UniqueName: \"kubernetes.io/projected/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-kube-api-access-n8wxh\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495501 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcnmq\" (UniqueName: \"kubernetes.io/projected/98ddc718-e67e-406f-aae3-03680232691b-kube-api-access-tcnmq\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495523 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-config-data\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495545 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-config-data-custom\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495563 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-logs\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495587 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-combined-ca-bundle\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495605 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-config\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495633 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495661 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98ddc718-e67e-406f-aae3-03680232691b-logs\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495686 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-combined-ca-bundle\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.495709 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.497290 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98ddc718-e67e-406f-aae3-03680232691b-logs\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.498339 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-logs\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.498651 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-api-6f556978d6-swcm4"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.500761 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-combined-ca-bundle\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.501701 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-config-data\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.503618 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-config-data-custom\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.503697 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.504369 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-config-data\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.507500 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.507977 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ddc718-e67e-406f-aae3-03680232691b-combined-ca-bundle\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.528311 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcnmq\" (UniqueName: \"kubernetes.io/projected/98ddc718-e67e-406f-aae3-03680232691b-kube-api-access-tcnmq\") pod \"barbican-keystone-listener-7c976ddb9d-ppssd\" (UID: \"98ddc718-e67e-406f-aae3-03680232691b\") " pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.529778 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-config-data-custom\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.531161 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-6f556978d6-swcm4"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.531683 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8wxh\" (UniqueName: \"kubernetes.io/projected/b573b144-d9a4-4ea5-8b28-d9e4e3ed6274-kube-api-access-n8wxh\") pod \"barbican-worker-6bddfb9bc9-6hzsp\" (UID: \"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274\") " pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.548717 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.581157 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.600901 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.600999 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz8bh\" (UniqueName: \"kubernetes.io/projected/2c4727f8-8456-4d35-9850-f3187aa4e9b4-kube-api-access-rz8bh\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.601087 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " 
pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.601179 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-config\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.601228 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.601295 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.602196 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-svc\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.602240 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.602755 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.603298 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.603884 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-config\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.619856 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz8bh\" (UniqueName: \"kubernetes.io/projected/2c4727f8-8456-4d35-9850-f3187aa4e9b4-kube-api-access-rz8bh\") pod \"dnsmasq-dns-688c87cc99-kxfpv\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.705400 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-combined-ca-bundle\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.705447 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfmb5\" (UniqueName: \"kubernetes.io/projected/41520592-8a80-47a4-a85c-6372ad2b5d28-kube-api-access-cfmb5\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.705482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.705577 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41520592-8a80-47a4-a85c-6372ad2b5d28-logs\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.705626 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data-custom\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.718160 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.772148 4830 scope.go:117] "RemoveContainer" containerID="77e2aaaf0900d73db61d6bf3dd9eaa6395415af722d9b32933901404a2116700" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.807281 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-combined-ca-bundle\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.807323 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfmb5\" (UniqueName: \"kubernetes.io/projected/41520592-8a80-47a4-a85c-6372ad2b5d28-kube-api-access-cfmb5\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.807371 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.807418 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data-custom\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.807442 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/41520592-8a80-47a4-a85c-6372ad2b5d28-logs\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.808267 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41520592-8a80-47a4-a85c-6372ad2b5d28-logs\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.812195 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-combined-ca-bundle\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.814169 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data-custom\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.815340 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data\") pod \"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.832731 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfmb5\" (UniqueName: \"kubernetes.io/projected/41520592-8a80-47a4-a85c-6372ad2b5d28-kube-api-access-cfmb5\") pod 
\"barbican-api-6f556978d6-swcm4\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.903486 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.953476 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.955215 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.962706 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.962984 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.963152 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mpg29" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.963796 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 11 09:35:31 crc kubenswrapper[4830]: I0311 09:35:31.993235 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.016280 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97gp\" (UniqueName: \"kubernetes.io/projected/5484cee6-2855-4f08-a439-991e26f49c1f-kube-api-access-d97gp\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.016345 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5484cee6-2855-4f08-a439-991e26f49c1f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.016391 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.016446 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.016519 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.016541 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.022816 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5f6b87df74-q5t2v" 
podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.111488 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kxfpv"] Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.118241 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97gp\" (UniqueName: \"kubernetes.io/projected/5484cee6-2855-4f08-a439-991e26f49c1f-kube-api-access-d97gp\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.118304 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5484cee6-2855-4f08-a439-991e26f49c1f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.118330 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.118378 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.118420 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.118437 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.124675 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5484cee6-2855-4f08-a439-991e26f49c1f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.126226 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.128434 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.138359 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " 
pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.158895 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.159692 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97gp\" (UniqueName: \"kubernetes.io/projected/5484cee6-2855-4f08-a439-991e26f49c1f-kube-api-access-d97gp\") pod \"cinder-scheduler-0\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.180741 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-2xcmq"] Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.183322 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.222174 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.222276 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.222311 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-config\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.222374 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.222473 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4d5b\" (UniqueName: \"kubernetes.io/projected/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-kube-api-access-h4d5b\") pod 
\"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.222509 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.250592 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-2xcmq"] Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.314828 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.316583 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.326756 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.334432 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.336281 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4d5b\" (UniqueName: \"kubernetes.io/projected/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-kube-api-access-h4d5b\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.336371 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.336424 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.336506 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.336540 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-config\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.336626 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.339613 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " 
pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.339686 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.340104 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.341115 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.358127 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-config\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.338010 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.359229 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-789dc4b6cd-xz7ds" podUID="77e86c78-b565-4e6c-8867-519fa2d5137a" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.395912 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4d5b\" (UniqueName: \"kubernetes.io/projected/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-kube-api-access-h4d5b\") pod \"dnsmasq-dns-6bb4fc677f-2xcmq\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.438649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mw8s\" (UniqueName: \"kubernetes.io/projected/785431d6-0213-4c97-a8d8-dad66e686f70-kube-api-access-8mw8s\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.438714 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.438743 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data-custom\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.438768 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-scripts\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " 
pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.438795 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/785431d6-0213-4c97-a8d8-dad66e686f70-etc-machine-id\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.438837 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785431d6-0213-4c97-a8d8-dad66e686f70-logs\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.438890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.547742 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.548248 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mw8s\" (UniqueName: \"kubernetes.io/projected/785431d6-0213-4c97-a8d8-dad66e686f70-kube-api-access-8mw8s\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.548333 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.548384 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data-custom\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.548438 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-scripts\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.548556 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/785431d6-0213-4c97-a8d8-dad66e686f70-etc-machine-id\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.548845 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785431d6-0213-4c97-a8d8-dad66e686f70-logs\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.549456 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785431d6-0213-4c97-a8d8-dad66e686f70-logs\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 
09:35:32.552492 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/785431d6-0213-4c97-a8d8-dad66e686f70-etc-machine-id\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.557178 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-scripts\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.558430 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.561647 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.566596 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.577768 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data-custom\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.594852 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mw8s\" (UniqueName: \"kubernetes.io/projected/785431d6-0213-4c97-a8d8-dad66e686f70-kube-api-access-8mw8s\") pod \"cinder-api-0\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: E0311 09:35:32.704653 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.817167 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.915192 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6bddfb9bc9-6hzsp"] Mar 11 09:35:32 crc kubenswrapper[4830]: I0311 09:35:32.966688 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a68d64-f644-4e4a-a216-af618d1883c8" path="/var/lib/kubelet/pods/a4a68d64-f644-4e4a-a216-af618d1883c8/volumes" Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.116966 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.127863 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f556978d6-swcm4"] Mar 11 09:35:33 crc kubenswrapper[4830]: W0311 09:35:33.134033 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41520592_8a80_47a4_a85c_6372ad2b5d28.slice/crio-b4213d17044e3c4d1f7306335653867de28a8f2a64e321c98d65e53d3055c9f6 WatchSource:0}: Error finding container b4213d17044e3c4d1f7306335653867de28a8f2a64e321c98d65e53d3055c9f6: Status 404 returned error can't find the container with id b4213d17044e3c4d1f7306335653867de28a8f2a64e321c98d65e53d3055c9f6 Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.138953 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c976ddb9d-ppssd"] Mar 11 09:35:33 crc kubenswrapper[4830]: W0311 09:35:33.147628 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5484cee6_2855_4f08_a439_991e26f49c1f.slice/crio-42525cd884b1bf0fb02628686a268883e52602688c561c25cb76a2423d472b4a WatchSource:0}: Error finding container 42525cd884b1bf0fb02628686a268883e52602688c561c25cb76a2423d472b4a: Status 404 returned error can't find the container with id 
42525cd884b1bf0fb02628686a268883e52602688c561c25cb76a2423d472b4a Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.152415 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kxfpv"] Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.331622 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-2xcmq"] Mar 11 09:35:33 crc kubenswrapper[4830]: W0311 09:35:33.343188 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7bd13ed_7aaf_420d_8ac8_90ac86e21cc7.slice/crio-7586a33145077135f1215537f24c81e3374ebc0285a7e180064da195b29389d5 WatchSource:0}: Error finding container 7586a33145077135f1215537f24c81e3374ebc0285a7e180064da195b29389d5: Status 404 returned error can't find the container with id 7586a33145077135f1215537f24c81e3374ebc0285a7e180064da195b29389d5 Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.381471 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" event={"ID":"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274","Type":"ContainerStarted","Data":"10758df6ff8a94956cc1b15aa4ef954d170a2f6d2fab48a498751a1ca370a25d"} Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.387755 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" event={"ID":"98ddc718-e67e-406f-aae3-03680232691b","Type":"ContainerStarted","Data":"ae1931d354a91a475adb3d38f49848a4df9477a8891a1df2715694f781cfd5ba"} Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.393867 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" event={"ID":"2c4727f8-8456-4d35-9850-f3187aa4e9b4","Type":"ContainerStarted","Data":"5775c342154190cd38c87ec7148106756296e7f022b5d79f381301dc9a186a41"} Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.405849 4830 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc3adf05-3cb9-4fda-be48-67b6b3084179","Type":"ContainerStarted","Data":"d59435a6a215394d33efa7749c6f841eb96f513c87dc5e63c06ce2820c9e5d2b"} Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.405954 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="ceilometer-notification-agent" containerID="cri-o://535fb6734fa534d36c37e754401c45c8c8f4f668b2eeb53e3653430c917a55fb" gracePeriod=30 Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.406107 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.406152 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="proxy-httpd" containerID="cri-o://d59435a6a215394d33efa7749c6f841eb96f513c87dc5e63c06ce2820c9e5d2b" gracePeriod=30 Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.406204 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="sg-core" containerID="cri-o://b63752527d4b2ca0b545c57d7b5c52f9778338607a2d1c8e1c1834b7d5c5b827" gracePeriod=30 Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.408980 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" event={"ID":"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7","Type":"ContainerStarted","Data":"7586a33145077135f1215537f24c81e3374ebc0285a7e180064da195b29389d5"} Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.415795 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f556978d6-swcm4" 
event={"ID":"41520592-8a80-47a4-a85c-6372ad2b5d28","Type":"ContainerStarted","Data":"25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e"} Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.415846 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f556978d6-swcm4" event={"ID":"41520592-8a80-47a4-a85c-6372ad2b5d28","Type":"ContainerStarted","Data":"b4213d17044e3c4d1f7306335653867de28a8f2a64e321c98d65e53d3055c9f6"} Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.418100 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5484cee6-2855-4f08-a439-991e26f49c1f","Type":"ContainerStarted","Data":"42525cd884b1bf0fb02628686a268883e52602688c561c25cb76a2423d472b4a"} Mar 11 09:35:33 crc kubenswrapper[4830]: I0311 09:35:33.449416 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.428626 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"785431d6-0213-4c97-a8d8-dad66e686f70","Type":"ContainerStarted","Data":"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310"} Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.429272 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"785431d6-0213-4c97-a8d8-dad66e686f70","Type":"ContainerStarted","Data":"e142470ef04447a5adb0129dfe2a923d123d03618c1d6e1f042f9dc68d54d16c"} Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.433088 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f556978d6-swcm4" event={"ID":"41520592-8a80-47a4-a85c-6372ad2b5d28","Type":"ContainerStarted","Data":"a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8"} Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.433191 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.433453 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.436059 4830 generic.go:334] "Generic (PLEG): container finished" podID="2c4727f8-8456-4d35-9850-f3187aa4e9b4" containerID="dcd629c7d737d3748c4ff108864aaa85eecda9f466d912fe0511ab3d0fc8b25b" exitCode=0 Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.436150 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" event={"ID":"2c4727f8-8456-4d35-9850-f3187aa4e9b4","Type":"ContainerDied","Data":"dcd629c7d737d3748c4ff108864aaa85eecda9f466d912fe0511ab3d0fc8b25b"} Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.446658 4830 generic.go:334] "Generic (PLEG): container finished" podID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerID="d59435a6a215394d33efa7749c6f841eb96f513c87dc5e63c06ce2820c9e5d2b" exitCode=0 Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.446710 4830 generic.go:334] "Generic (PLEG): container finished" podID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerID="b63752527d4b2ca0b545c57d7b5c52f9778338607a2d1c8e1c1834b7d5c5b827" exitCode=2 Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.446777 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc3adf05-3cb9-4fda-be48-67b6b3084179","Type":"ContainerDied","Data":"d59435a6a215394d33efa7749c6f841eb96f513c87dc5e63c06ce2820c9e5d2b"} Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.446807 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc3adf05-3cb9-4fda-be48-67b6b3084179","Type":"ContainerDied","Data":"b63752527d4b2ca0b545c57d7b5c52f9778338607a2d1c8e1c1834b7d5c5b827"} Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.448596 4830 generic.go:334] "Generic (PLEG): 
container finished" podID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" containerID="de8f478e3d286ad249d1f4c3460d16d8d1a5e2a07f31ff7c5562f44099f0a087" exitCode=0 Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.448626 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" event={"ID":"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7","Type":"ContainerDied","Data":"de8f478e3d286ad249d1f4c3460d16d8d1a5e2a07f31ff7c5562f44099f0a087"} Mar 11 09:35:34 crc kubenswrapper[4830]: I0311 09:35:34.465612 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f556978d6-swcm4" podStartSLOduration=3.465591123 podStartE2EDuration="3.465591123s" podCreationTimestamp="2026-03-11 09:35:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:34.460493122 +0000 UTC m=+1302.241643821" watchObservedRunningTime="2026-03-11 09:35:34.465591123 +0000 UTC m=+1302.246741812" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.398645 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.428161 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.478248 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" event={"ID":"2c4727f8-8456-4d35-9850-f3187aa4e9b4","Type":"ContainerDied","Data":"5775c342154190cd38c87ec7148106756296e7f022b5d79f381301dc9a186a41"} Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.478344 4830 scope.go:117] "RemoveContainer" containerID="dcd629c7d737d3748c4ff108864aaa85eecda9f466d912fe0511ab3d0fc8b25b" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.478258 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-kxfpv" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.486398 4830 generic.go:334] "Generic (PLEG): container finished" podID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerID="535fb6734fa534d36c37e754401c45c8c8f4f668b2eeb53e3653430c917a55fb" exitCode=0 Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.487377 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc3adf05-3cb9-4fda-be48-67b6b3084179","Type":"ContainerDied","Data":"535fb6734fa534d36c37e754401c45c8c8f4f668b2eeb53e3653430c917a55fb"} Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.524210 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-swift-storage-0\") pod \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.524274 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz8bh\" (UniqueName: \"kubernetes.io/projected/2c4727f8-8456-4d35-9850-f3187aa4e9b4-kube-api-access-rz8bh\") pod \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.524471 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-svc\") pod \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.524537 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-nb\") pod \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\" 
(UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.524697 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-config\") pod \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.524735 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-sb\") pod \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\" (UID: \"2c4727f8-8456-4d35-9850-f3187aa4e9b4\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.565414 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4727f8-8456-4d35-9850-f3187aa4e9b4-kube-api-access-rz8bh" (OuterVolumeSpecName: "kube-api-access-rz8bh") pod "2c4727f8-8456-4d35-9850-f3187aa4e9b4" (UID: "2c4727f8-8456-4d35-9850-f3187aa4e9b4"). InnerVolumeSpecName "kube-api-access-rz8bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.575450 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c4727f8-8456-4d35-9850-f3187aa4e9b4" (UID: "2c4727f8-8456-4d35-9850-f3187aa4e9b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.600383 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c4727f8-8456-4d35-9850-f3187aa4e9b4" (UID: "2c4727f8-8456-4d35-9850-f3187aa4e9b4"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.610204 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c4727f8-8456-4d35-9850-f3187aa4e9b4" (UID: "2c4727f8-8456-4d35-9850-f3187aa4e9b4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.627439 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.627480 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz8bh\" (UniqueName: \"kubernetes.io/projected/2c4727f8-8456-4d35-9850-f3187aa4e9b4-kube-api-access-rz8bh\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.627499 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.627511 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.664900 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c4727f8-8456-4d35-9850-f3187aa4e9b4" (UID: "2c4727f8-8456-4d35-9850-f3187aa4e9b4"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.679110 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-config" (OuterVolumeSpecName: "config") pod "2c4727f8-8456-4d35-9850-f3187aa4e9b4" (UID: "2c4727f8-8456-4d35-9850-f3187aa4e9b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.726278 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.729421 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.729451 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c4727f8-8456-4d35-9850-f3187aa4e9b4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.831286 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-sg-core-conf-yaml\") pod \"bc3adf05-3cb9-4fda-be48-67b6b3084179\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.831366 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-scripts\") pod \"bc3adf05-3cb9-4fda-be48-67b6b3084179\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.831517 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-log-httpd\") pod \"bc3adf05-3cb9-4fda-be48-67b6b3084179\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.831574 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9r6v\" (UniqueName: \"kubernetes.io/projected/bc3adf05-3cb9-4fda-be48-67b6b3084179-kube-api-access-b9r6v\") pod \"bc3adf05-3cb9-4fda-be48-67b6b3084179\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.831634 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-config-data\") pod \"bc3adf05-3cb9-4fda-be48-67b6b3084179\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.831650 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-combined-ca-bundle\") pod \"bc3adf05-3cb9-4fda-be48-67b6b3084179\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.831692 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-run-httpd\") pod \"bc3adf05-3cb9-4fda-be48-67b6b3084179\" (UID: \"bc3adf05-3cb9-4fda-be48-67b6b3084179\") " Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.832246 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bc3adf05-3cb9-4fda-be48-67b6b3084179" (UID: 
"bc3adf05-3cb9-4fda-be48-67b6b3084179"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.832756 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bc3adf05-3cb9-4fda-be48-67b6b3084179" (UID: "bc3adf05-3cb9-4fda-be48-67b6b3084179"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.878327 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kxfpv"] Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.879439 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc3adf05-3cb9-4fda-be48-67b6b3084179-kube-api-access-b9r6v" (OuterVolumeSpecName: "kube-api-access-b9r6v") pod "bc3adf05-3cb9-4fda-be48-67b6b3084179" (UID: "bc3adf05-3cb9-4fda-be48-67b6b3084179"). InnerVolumeSpecName "kube-api-access-b9r6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.881254 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-scripts" (OuterVolumeSpecName: "scripts") pod "bc3adf05-3cb9-4fda-be48-67b6b3084179" (UID: "bc3adf05-3cb9-4fda-be48-67b6b3084179"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.891486 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-kxfpv"] Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.899772 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bc3adf05-3cb9-4fda-be48-67b6b3084179" (UID: "bc3adf05-3cb9-4fda-be48-67b6b3084179"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.934436 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.934480 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.934496 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9r6v\" (UniqueName: \"kubernetes.io/projected/bc3adf05-3cb9-4fda-be48-67b6b3084179-kube-api-access-b9r6v\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.934511 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc3adf05-3cb9-4fda-be48-67b6b3084179-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 crc kubenswrapper[4830]: I0311 09:35:35.934525 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:35 
crc kubenswrapper[4830]: I0311 09:35:35.955919 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc3adf05-3cb9-4fda-be48-67b6b3084179" (UID: "bc3adf05-3cb9-4fda-be48-67b6b3084179"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.009417 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-config-data" (OuterVolumeSpecName: "config-data") pod "bc3adf05-3cb9-4fda-be48-67b6b3084179" (UID: "bc3adf05-3cb9-4fda-be48-67b6b3084179"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.037286 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.037315 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3adf05-3cb9-4fda-be48-67b6b3084179-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.497917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"785431d6-0213-4c97-a8d8-dad66e686f70","Type":"ContainerStarted","Data":"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.498315 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.498265 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" containerName="cinder-api" containerID="cri-o://0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583" gracePeriod=30 Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.498035 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" containerName="cinder-api-log" containerID="cri-o://480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310" gracePeriod=30 Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.504237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5484cee6-2855-4f08-a439-991e26f49c1f","Type":"ContainerStarted","Data":"4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.504277 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5484cee6-2855-4f08-a439-991e26f49c1f","Type":"ContainerStarted","Data":"0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.512193 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" event={"ID":"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274","Type":"ContainerStarted","Data":"6d6b8abe39c528aa8c4654993db02c9a10efeff38c00699e5f5413da1be6b70a"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.512236 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" event={"ID":"b573b144-d9a4-4ea5-8b28-d9e4e3ed6274","Type":"ContainerStarted","Data":"b73d89c1fb526a60ff1320cb80764036a25adaa6ad64bfd848356d22da8fc43c"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.521438 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" 
event={"ID":"98ddc718-e67e-406f-aae3-03680232691b","Type":"ContainerStarted","Data":"bb2e34428f6bdaa92daaa3f661003da767a1dc670b796feedf3e1edbe56cfae5"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.522365 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" event={"ID":"98ddc718-e67e-406f-aae3-03680232691b","Type":"ContainerStarted","Data":"b9c2611b83e0b0b29a8709a92dc2cd1018db99360780ed493d9d61240ec889f8"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.533580 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc3adf05-3cb9-4fda-be48-67b6b3084179","Type":"ContainerDied","Data":"6253530cd7ec576fa880a84675bc596e10ff1c23e2b88e95d5243d01305a9d37"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.533660 4830 scope.go:117] "RemoveContainer" containerID="d59435a6a215394d33efa7749c6f841eb96f513c87dc5e63c06ce2820c9e5d2b" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.533659 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.540634 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.540611172 podStartE2EDuration="4.540611172s" podCreationTimestamp="2026-03-11 09:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:36.524376122 +0000 UTC m=+1304.305526831" watchObservedRunningTime="2026-03-11 09:35:36.540611172 +0000 UTC m=+1304.321761861" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.542759 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" event={"ID":"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7","Type":"ContainerStarted","Data":"f45b5be0bef291446fd5d3bd7d090b740138b75f83e037bab72b907404b42ef6"} Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.543781 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.547920 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6bddfb9bc9-6hzsp" podStartSLOduration=3.198730839 podStartE2EDuration="5.547901725s" podCreationTimestamp="2026-03-11 09:35:31 +0000 UTC" firstStartedPulling="2026-03-11 09:35:32.926128042 +0000 UTC m=+1300.707278731" lastFinishedPulling="2026-03-11 09:35:35.275298928 +0000 UTC m=+1303.056449617" observedRunningTime="2026-03-11 09:35:36.545587111 +0000 UTC m=+1304.326737820" watchObservedRunningTime="2026-03-11 09:35:36.547901725 +0000 UTC m=+1304.329052444" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.574058 4830 scope.go:117] "RemoveContainer" containerID="b63752527d4b2ca0b545c57d7b5c52f9778338607a2d1c8e1c1834b7d5c5b827" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.585149 4830 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c976ddb9d-ppssd" podStartSLOduration=3.428747772 podStartE2EDuration="5.584985464s" podCreationTimestamp="2026-03-11 09:35:31 +0000 UTC" firstStartedPulling="2026-03-11 09:35:33.171311995 +0000 UTC m=+1300.952462684" lastFinishedPulling="2026-03-11 09:35:35.327549687 +0000 UTC m=+1303.108700376" observedRunningTime="2026-03-11 09:35:36.574910194 +0000 UTC m=+1304.356060883" watchObservedRunningTime="2026-03-11 09:35:36.584985464 +0000 UTC m=+1304.366136153" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.597842 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.607103063 podStartE2EDuration="5.59782219s" podCreationTimestamp="2026-03-11 09:35:31 +0000 UTC" firstStartedPulling="2026-03-11 09:35:33.156203515 +0000 UTC m=+1300.937354214" lastFinishedPulling="2026-03-11 09:35:34.146922652 +0000 UTC m=+1301.928073341" observedRunningTime="2026-03-11 09:35:36.596509843 +0000 UTC m=+1304.377660542" watchObservedRunningTime="2026-03-11 09:35:36.59782219 +0000 UTC m=+1304.378972879" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.646096 4830 scope.go:117] "RemoveContainer" containerID="535fb6734fa534d36c37e754401c45c8c8f4f668b2eeb53e3653430c917a55fb" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.675557 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.691133 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706188 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:36 crc kubenswrapper[4830]: E0311 09:35:36.706633 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" 
containerName="ceilometer-notification-agent" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706651 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="ceilometer-notification-agent" Mar 11 09:35:36 crc kubenswrapper[4830]: E0311 09:35:36.706662 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="proxy-httpd" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706669 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="proxy-httpd" Mar 11 09:35:36 crc kubenswrapper[4830]: E0311 09:35:36.706705 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c4727f8-8456-4d35-9850-f3187aa4e9b4" containerName="init" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706711 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4727f8-8456-4d35-9850-f3187aa4e9b4" containerName="init" Mar 11 09:35:36 crc kubenswrapper[4830]: E0311 09:35:36.706730 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="sg-core" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706743 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="sg-core" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706912 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="sg-core" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706929 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" containerName="ceilometer-notification-agent" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706941 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" 
containerName="proxy-httpd" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.706960 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c4727f8-8456-4d35-9850-f3187aa4e9b4" containerName="init" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.708567 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.712900 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.713309 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.718904 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" podStartSLOduration=4.718880649 podStartE2EDuration="4.718880649s" podCreationTimestamp="2026-03-11 09:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:36.682287194 +0000 UTC m=+1304.463437903" watchObservedRunningTime="2026-03-11 09:35:36.718880649 +0000 UTC m=+1304.500031338" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.739259 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.869614 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-run-httpd\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.869722 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-config-data\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.869783 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvt5\" (UniqueName: \"kubernetes.io/projected/84753481-6831-4cec-8e80-e0141fdad711-kube-api-access-hpvt5\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.869811 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.869851 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-log-httpd\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.869874 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.869915 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-scripts\") pod \"ceilometer-0\" (UID: 
\"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.953385 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4727f8-8456-4d35-9850-f3187aa4e9b4" path="/var/lib/kubelet/pods/2c4727f8-8456-4d35-9850-f3187aa4e9b4/volumes" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.958406 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc3adf05-3cb9-4fda-be48-67b6b3084179" path="/var/lib/kubelet/pods/bc3adf05-3cb9-4fda-be48-67b6b3084179/volumes" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.975293 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-run-httpd\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.975742 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-run-httpd\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.975954 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-config-data\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.976036 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvt5\" (UniqueName: \"kubernetes.io/projected/84753481-6831-4cec-8e80-e0141fdad711-kube-api-access-hpvt5\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc 
kubenswrapper[4830]: I0311 09:35:36.976072 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.976114 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-log-httpd\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.976130 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.976169 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-scripts\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.977538 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-log-httpd\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.985812 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.985995 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-scripts\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.986389 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-config-data\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.989055 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:36 crc kubenswrapper[4830]: I0311 09:35:36.994601 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvt5\" (UniqueName: \"kubernetes.io/projected/84753481-6831-4cec-8e80-e0141fdad711-kube-api-access-hpvt5\") pod \"ceilometer-0\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " pod="openstack/ceilometer-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.031614 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.246534 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.340483 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.383211 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mw8s\" (UniqueName: \"kubernetes.io/projected/785431d6-0213-4c97-a8d8-dad66e686f70-kube-api-access-8mw8s\") pod \"785431d6-0213-4c97-a8d8-dad66e686f70\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.383334 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data-custom\") pod \"785431d6-0213-4c97-a8d8-dad66e686f70\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.383373 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-combined-ca-bundle\") pod \"785431d6-0213-4c97-a8d8-dad66e686f70\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.383445 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/785431d6-0213-4c97-a8d8-dad66e686f70-etc-machine-id\") pod \"785431d6-0213-4c97-a8d8-dad66e686f70\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.383483 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785431d6-0213-4c97-a8d8-dad66e686f70-logs\") pod \"785431d6-0213-4c97-a8d8-dad66e686f70\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") 
" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.383532 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-scripts\") pod \"785431d6-0213-4c97-a8d8-dad66e686f70\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.383780 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data\") pod \"785431d6-0213-4c97-a8d8-dad66e686f70\" (UID: \"785431d6-0213-4c97-a8d8-dad66e686f70\") " Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.389124 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/785431d6-0213-4c97-a8d8-dad66e686f70-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "785431d6-0213-4c97-a8d8-dad66e686f70" (UID: "785431d6-0213-4c97-a8d8-dad66e686f70"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.389855 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785431d6-0213-4c97-a8d8-dad66e686f70-logs" (OuterVolumeSpecName: "logs") pod "785431d6-0213-4c97-a8d8-dad66e686f70" (UID: "785431d6-0213-4c97-a8d8-dad66e686f70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.397183 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "785431d6-0213-4c97-a8d8-dad66e686f70" (UID: "785431d6-0213-4c97-a8d8-dad66e686f70"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.397215 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-scripts" (OuterVolumeSpecName: "scripts") pod "785431d6-0213-4c97-a8d8-dad66e686f70" (UID: "785431d6-0213-4c97-a8d8-dad66e686f70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.397272 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/785431d6-0213-4c97-a8d8-dad66e686f70-kube-api-access-8mw8s" (OuterVolumeSpecName: "kube-api-access-8mw8s") pod "785431d6-0213-4c97-a8d8-dad66e686f70" (UID: "785431d6-0213-4c97-a8d8-dad66e686f70"). InnerVolumeSpecName "kube-api-access-8mw8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.424395 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "785431d6-0213-4c97-a8d8-dad66e686f70" (UID: "785431d6-0213-4c97-a8d8-dad66e686f70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.455780 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data" (OuterVolumeSpecName: "config-data") pod "785431d6-0213-4c97-a8d8-dad66e686f70" (UID: "785431d6-0213-4c97-a8d8-dad66e686f70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.491030 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/785431d6-0213-4c97-a8d8-dad66e686f70-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.491071 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/785431d6-0213-4c97-a8d8-dad66e686f70-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.491084 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.491095 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.491108 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mw8s\" (UniqueName: \"kubernetes.io/projected/785431d6-0213-4c97-a8d8-dad66e686f70-kube-api-access-8mw8s\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.491123 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.491135 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/785431d6-0213-4c97-a8d8-dad66e686f70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.562100 4830 generic.go:334] "Generic 
(PLEG): container finished" podID="785431d6-0213-4c97-a8d8-dad66e686f70" containerID="0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583" exitCode=0 Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.562144 4830 generic.go:334] "Generic (PLEG): container finished" podID="785431d6-0213-4c97-a8d8-dad66e686f70" containerID="480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310" exitCode=143 Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.562185 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.562212 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"785431d6-0213-4c97-a8d8-dad66e686f70","Type":"ContainerDied","Data":"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583"} Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.562280 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"785431d6-0213-4c97-a8d8-dad66e686f70","Type":"ContainerDied","Data":"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310"} Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.562299 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"785431d6-0213-4c97-a8d8-dad66e686f70","Type":"ContainerDied","Data":"e142470ef04447a5adb0129dfe2a923d123d03618c1d6e1f042f9dc68d54d16c"} Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.562365 4830 scope.go:117] "RemoveContainer" containerID="0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.570093 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:37 crc kubenswrapper[4830]: W0311 09:35:37.585296 4830 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84753481_6831_4cec_8e80_e0141fdad711.slice/crio-ed02fe465a3879c81a95c7982e189be5daf6e8f60ecf6c3c3cfd615bcd60084e WatchSource:0}: Error finding container ed02fe465a3879c81a95c7982e189be5daf6e8f60ecf6c3c3cfd615bcd60084e: Status 404 returned error can't find the container with id ed02fe465a3879c81a95c7982e189be5daf6e8f60ecf6c3c3cfd615bcd60084e Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.646891 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.652925 4830 scope.go:117] "RemoveContainer" containerID="480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.654976 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.672599 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:37 crc kubenswrapper[4830]: E0311 09:35:37.673064 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" containerName="cinder-api-log" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.673084 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" containerName="cinder-api-log" Mar 11 09:35:37 crc kubenswrapper[4830]: E0311 09:35:37.673104 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" containerName="cinder-api" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.673111 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" containerName="cinder-api" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.673332 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" 
containerName="cinder-api" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.673352 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" containerName="cinder-api-log" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.674471 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.684012 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.684214 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.690285 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.691198 4830 scope.go:117] "RemoveContainer" containerID="0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583" Mar 11 09:35:37 crc kubenswrapper[4830]: E0311 09:35:37.691472 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583\": container with ID starting with 0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583 not found: ID does not exist" containerID="0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.691500 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583"} err="failed to get container status \"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583\": rpc error: code = NotFound desc = could not find container 
\"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583\": container with ID starting with 0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583 not found: ID does not exist" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.691522 4830 scope.go:117] "RemoveContainer" containerID="480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310" Mar 11 09:35:37 crc kubenswrapper[4830]: E0311 09:35:37.691667 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310\": container with ID starting with 480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310 not found: ID does not exist" containerID="480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.691685 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310"} err="failed to get container status \"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310\": rpc error: code = NotFound desc = could not find container \"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310\": container with ID starting with 480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310 not found: ID does not exist" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.691696 4830 scope.go:117] "RemoveContainer" containerID="0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.691826 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583"} err="failed to get container status \"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583\": rpc error: code = NotFound desc = could not find 
container \"0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583\": container with ID starting with 0192cf633add35f6c6451aaa17d205aee5cb9436309a7944f776c68b4b598583 not found: ID does not exist" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.691843 4830 scope.go:117] "RemoveContainer" containerID="480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.691966 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310"} err="failed to get container status \"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310\": rpc error: code = NotFound desc = could not find container \"480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310\": container with ID starting with 480bc067de0d9a03f532557ccc5a15b2fb8db396ad5f7f7cfdf3a2d4c6b93310 not found: ID does not exist" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.712119 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.797592 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcld6\" (UniqueName: \"kubernetes.io/projected/f96f1f82-873d-4665-8273-65bfc41ba374-kube-api-access-zcld6\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.798260 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.798366 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-config-data\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.798405 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f96f1f82-873d-4665-8273-65bfc41ba374-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.798438 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-scripts\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.798466 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-config-data-custom\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.798676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f96f1f82-873d-4665-8273-65bfc41ba374-logs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.798962 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.799121 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.900442 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-config-data-custom\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.900495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f96f1f82-873d-4665-8273-65bfc41ba374-logs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.900541 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.900560 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc 
kubenswrapper[4830]: I0311 09:35:37.900597 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcld6\" (UniqueName: \"kubernetes.io/projected/f96f1f82-873d-4665-8273-65bfc41ba374-kube-api-access-zcld6\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.900630 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.900689 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-config-data\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.900709 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f96f1f82-873d-4665-8273-65bfc41ba374-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.900725 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-scripts\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.902206 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/f96f1f82-873d-4665-8273-65bfc41ba374-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.903104 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f96f1f82-873d-4665-8273-65bfc41ba374-logs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.905597 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-config-data-custom\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.906386 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.907526 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-config-data\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.910493 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.910894 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-scripts\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.914360 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f96f1f82-873d-4665-8273-65bfc41ba374-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:37 crc kubenswrapper[4830]: I0311 09:35:37.927338 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcld6\" (UniqueName: \"kubernetes.io/projected/f96f1f82-873d-4665-8273-65bfc41ba374-kube-api-access-zcld6\") pod \"cinder-api-0\" (UID: \"f96f1f82-873d-4665-8273-65bfc41ba374\") " pod="openstack/cinder-api-0" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.005691 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.508432 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 11 09:35:38 crc kubenswrapper[4830]: W0311 09:35:38.511817 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf96f1f82_873d_4665_8273_65bfc41ba374.slice/crio-7752e862615e1e41c09e65a3a884f4413fa79f9e495105b3c0e001ee5da6ea05 WatchSource:0}: Error finding container 7752e862615e1e41c09e65a3a884f4413fa79f9e495105b3c0e001ee5da6ea05: Status 404 returned error can't find the container with id 7752e862615e1e41c09e65a3a884f4413fa79f9e495105b3c0e001ee5da6ea05 Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.585278 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f96f1f82-873d-4665-8273-65bfc41ba374","Type":"ContainerStarted","Data":"7752e862615e1e41c09e65a3a884f4413fa79f9e495105b3c0e001ee5da6ea05"} Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.589098 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerStarted","Data":"00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0"} Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.589177 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerStarted","Data":"ed02fe465a3879c81a95c7982e189be5daf6e8f60ecf6c3c3cfd615bcd60084e"} Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.727187 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7888bcc99b-t8slf"] Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.728957 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.739541 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.739752 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.752227 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7888bcc99b-t8slf"] Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.819435 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-config-data\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.819496 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-config-data-custom\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.819589 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-combined-ca-bundle\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.819926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njbzp\" 
(UniqueName: \"kubernetes.io/projected/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-kube-api-access-njbzp\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.820068 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-logs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.820115 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-public-tls-certs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.820312 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-internal-tls-certs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.922142 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-combined-ca-bundle\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.922226 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-njbzp\" (UniqueName: \"kubernetes.io/projected/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-kube-api-access-njbzp\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.922260 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-logs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.922279 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-public-tls-certs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.922306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-internal-tls-certs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.922368 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-config-data\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.922388 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-config-data-custom\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.923049 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-logs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.928790 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-config-data-custom\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.934976 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-combined-ca-bundle\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.938815 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-public-tls-certs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.942075 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-config-data\") pod 
\"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.942538 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njbzp\" (UniqueName: \"kubernetes.io/projected/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-kube-api-access-njbzp\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.948803 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="785431d6-0213-4c97-a8d8-dad66e686f70" path="/var/lib/kubelet/pods/785431d6-0213-4c97-a8d8-dad66e686f70/volumes" Mar 11 09:35:38 crc kubenswrapper[4830]: I0311 09:35:38.956942 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe664eb-daf0-4aeb-ae09-f47b2204bdf1-internal-tls-certs\") pod \"barbican-api-7888bcc99b-t8slf\" (UID: \"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1\") " pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:39 crc kubenswrapper[4830]: I0311 09:35:39.090758 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:39 crc kubenswrapper[4830]: I0311 09:35:39.605004 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f96f1f82-873d-4665-8273-65bfc41ba374","Type":"ContainerStarted","Data":"8d84a9601d19a08f62deddc09d82b9706128841db09f01ab062e9234ac2841b8"} Mar 11 09:35:39 crc kubenswrapper[4830]: I0311 09:35:39.609439 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerStarted","Data":"b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa"} Mar 11 09:35:39 crc kubenswrapper[4830]: I0311 09:35:39.785073 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7888bcc99b-t8slf"] Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.624062 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f96f1f82-873d-4665-8273-65bfc41ba374","Type":"ContainerStarted","Data":"2a57a1f0f272df8f5fe674994453303398dfda30094153d9af6d67131a397bff"} Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.626194 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.629571 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerStarted","Data":"b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2"} Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.632846 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7888bcc99b-t8slf" event={"ID":"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1","Type":"ContainerStarted","Data":"da7cf7d14e051f3157e74ad87a7ead8bfa82feac12b091ed60cd3deb9a983077"} Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.632907 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7888bcc99b-t8slf" event={"ID":"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1","Type":"ContainerStarted","Data":"b047edc1676a4a22328b2075d13b8476d069a4e56e34b53074e03ea8327fceb6"} Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.632928 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7888bcc99b-t8slf" event={"ID":"bbe664eb-daf0-4aeb-ae09-f47b2204bdf1","Type":"ContainerStarted","Data":"e8715e110a697290ccd2d2e268b972e8749ada5cfd47536c43647340446195f1"} Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.633835 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.633869 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.650527 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.650504758 podStartE2EDuration="3.650504758s" podCreationTimestamp="2026-03-11 09:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:40.647858314 +0000 UTC m=+1308.429009013" watchObservedRunningTime="2026-03-11 09:35:40.650504758 +0000 UTC m=+1308.431655447" Mar 11 09:35:40 crc kubenswrapper[4830]: I0311 09:35:40.682930 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7888bcc99b-t8slf" podStartSLOduration=2.682897156 podStartE2EDuration="2.682897156s" podCreationTimestamp="2026-03-11 09:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:40.675654945 +0000 UTC m=+1308.456805654" watchObservedRunningTime="2026-03-11 09:35:40.682897156 
+0000 UTC m=+1308.464047845" Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.479826 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6f556978d6-swcm4" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.563916 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.570408 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.654304 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.667818 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-ntjpw"] Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.668248 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" podUID="addc50b8-ce68-4690-9979-07b2f596215d" containerName="dnsmasq-dns" containerID="cri-o://5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524" gracePeriod=10 Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.669121 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerStarted","Data":"b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769"} Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.670698 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.670860 4830 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-scheduler-0" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" containerName="cinder-scheduler" containerID="cri-o://0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869" gracePeriod=30 Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.670971 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" containerName="probe" containerID="cri-o://4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3" gracePeriod=30 Mar 11 09:35:42 crc kubenswrapper[4830]: I0311 09:35:42.717460 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.511308838 podStartE2EDuration="6.717419122s" podCreationTimestamp="2026-03-11 09:35:36 +0000 UTC" firstStartedPulling="2026-03-11 09:35:37.58865926 +0000 UTC m=+1305.369809949" lastFinishedPulling="2026-03-11 09:35:41.794769544 +0000 UTC m=+1309.575920233" observedRunningTime="2026-03-11 09:35:42.708805773 +0000 UTC m=+1310.489956482" watchObservedRunningTime="2026-03-11 09:35:42.717419122 +0000 UTC m=+1310.498569821" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.210878 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.285877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-sb\") pod \"addc50b8-ce68-4690-9979-07b2f596215d\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.285949 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-svc\") pod \"addc50b8-ce68-4690-9979-07b2f596215d\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.286069 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-config\") pod \"addc50b8-ce68-4690-9979-07b2f596215d\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.286165 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-swift-storage-0\") pod \"addc50b8-ce68-4690-9979-07b2f596215d\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.286220 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb7rk\" (UniqueName: \"kubernetes.io/projected/addc50b8-ce68-4690-9979-07b2f596215d-kube-api-access-fb7rk\") pod \"addc50b8-ce68-4690-9979-07b2f596215d\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.286304 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-nb\") pod \"addc50b8-ce68-4690-9979-07b2f596215d\" (UID: \"addc50b8-ce68-4690-9979-07b2f596215d\") " Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.302561 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/addc50b8-ce68-4690-9979-07b2f596215d-kube-api-access-fb7rk" (OuterVolumeSpecName: "kube-api-access-fb7rk") pod "addc50b8-ce68-4690-9979-07b2f596215d" (UID: "addc50b8-ce68-4690-9979-07b2f596215d"). InnerVolumeSpecName "kube-api-access-fb7rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.362646 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "addc50b8-ce68-4690-9979-07b2f596215d" (UID: "addc50b8-ce68-4690-9979-07b2f596215d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.363602 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "addc50b8-ce68-4690-9979-07b2f596215d" (UID: "addc50b8-ce68-4690-9979-07b2f596215d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.377492 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "addc50b8-ce68-4690-9979-07b2f596215d" (UID: "addc50b8-ce68-4690-9979-07b2f596215d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.377562 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-config" (OuterVolumeSpecName: "config") pod "addc50b8-ce68-4690-9979-07b2f596215d" (UID: "addc50b8-ce68-4690-9979-07b2f596215d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.389141 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.389183 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.389195 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.389206 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb7rk\" (UniqueName: \"kubernetes.io/projected/addc50b8-ce68-4690-9979-07b2f596215d-kube-api-access-fb7rk\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.389218 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.410608 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "addc50b8-ce68-4690-9979-07b2f596215d" (UID: "addc50b8-ce68-4690-9979-07b2f596215d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.490787 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/addc50b8-ce68-4690-9979-07b2f596215d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.575151 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.680488 4830 generic.go:334] "Generic (PLEG): container finished" podID="addc50b8-ce68-4690-9979-07b2f596215d" containerID="5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524" exitCode=0 Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.680542 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.680573 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" event={"ID":"addc50b8-ce68-4690-9979-07b2f596215d","Type":"ContainerDied","Data":"5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524"} Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.680641 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-ntjpw" event={"ID":"addc50b8-ce68-4690-9979-07b2f596215d","Type":"ContainerDied","Data":"fa262b4a620376170430edd07ffbdfc8b7be2c0bda120acef00ae4ed93ba4500"} Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.680688 4830 scope.go:117] "RemoveContainer" containerID="5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.682833 4830 generic.go:334] "Generic (PLEG): container finished" podID="5484cee6-2855-4f08-a439-991e26f49c1f" containerID="4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3" exitCode=0 Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.682916 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5484cee6-2855-4f08-a439-991e26f49c1f","Type":"ContainerDied","Data":"4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3"} Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.704867 4830 scope.go:117] "RemoveContainer" containerID="c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.724322 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-ntjpw"] Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.734367 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-ntjpw"] Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 
09:35:43.740179 4830 scope.go:117] "RemoveContainer" containerID="5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524" Mar 11 09:35:43 crc kubenswrapper[4830]: E0311 09:35:43.740577 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524\": container with ID starting with 5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524 not found: ID does not exist" containerID="5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.740624 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524"} err="failed to get container status \"5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524\": rpc error: code = NotFound desc = could not find container \"5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524\": container with ID starting with 5374bf4a94b384e5431e36eed846f8b9c9108ea3c5d115b75e38c8a779c69524 not found: ID does not exist" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.740652 4830 scope.go:117] "RemoveContainer" containerID="c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b" Mar 11 09:35:43 crc kubenswrapper[4830]: E0311 09:35:43.740911 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b\": container with ID starting with c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b not found: ID does not exist" containerID="c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b" Mar 11 09:35:43 crc kubenswrapper[4830]: I0311 09:35:43.740955 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b"} err="failed to get container status \"c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b\": rpc error: code = NotFound desc = could not find container \"c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b\": container with ID starting with c3a1bd32378ae05de417b367dcba03207ccbbfd7f411a7234f62c003d678994b not found: ID does not exist" Mar 11 09:35:44 crc kubenswrapper[4830]: I0311 09:35:44.029203 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:44 crc kubenswrapper[4830]: I0311 09:35:44.407323 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:35:44 crc kubenswrapper[4830]: I0311 09:35:44.701656 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:35:44 crc kubenswrapper[4830]: I0311 09:35:44.947148 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="addc50b8-ce68-4690-9979-07b2f596215d" path="/var/lib/kubelet/pods/addc50b8-ce68-4690-9979-07b2f596215d/volumes" Mar 11 09:35:45 crc kubenswrapper[4830]: I0311 09:35:45.596222 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-559977bfdc-r7ssx" Mar 11 09:35:46 crc kubenswrapper[4830]: I0311 09:35:46.234248 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:35:46 crc kubenswrapper[4830]: I0311 09:35:46.661602 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-789dc4b6cd-xz7ds" Mar 11 09:35:46 crc kubenswrapper[4830]: I0311 09:35:46.736605 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f6b87df74-q5t2v"] Mar 11 09:35:46 crc kubenswrapper[4830]: I0311 
09:35:46.736848 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f6b87df74-q5t2v" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon-log" containerID="cri-o://af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10" gracePeriod=30 Mar 11 09:35:46 crc kubenswrapper[4830]: I0311 09:35:46.737168 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f6b87df74-q5t2v" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" containerID="cri-o://e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8" gracePeriod=30 Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.651827 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.671291 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-combined-ca-bundle\") pod \"5484cee6-2855-4f08-a439-991e26f49c1f\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.671342 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data\") pod \"5484cee6-2855-4f08-a439-991e26f49c1f\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.671453 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d97gp\" (UniqueName: \"kubernetes.io/projected/5484cee6-2855-4f08-a439-991e26f49c1f-kube-api-access-d97gp\") pod \"5484cee6-2855-4f08-a439-991e26f49c1f\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.671495 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5484cee6-2855-4f08-a439-991e26f49c1f-etc-machine-id\") pod \"5484cee6-2855-4f08-a439-991e26f49c1f\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.671550 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-scripts\") pod \"5484cee6-2855-4f08-a439-991e26f49c1f\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.671585 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data-custom\") pod \"5484cee6-2855-4f08-a439-991e26f49c1f\" (UID: \"5484cee6-2855-4f08-a439-991e26f49c1f\") " Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.672183 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5484cee6-2855-4f08-a439-991e26f49c1f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5484cee6-2855-4f08-a439-991e26f49c1f" (UID: "5484cee6-2855-4f08-a439-991e26f49c1f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.678086 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-scripts" (OuterVolumeSpecName: "scripts") pod "5484cee6-2855-4f08-a439-991e26f49c1f" (UID: "5484cee6-2855-4f08-a439-991e26f49c1f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.700595 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5484cee6-2855-4f08-a439-991e26f49c1f" (UID: "5484cee6-2855-4f08-a439-991e26f49c1f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.705328 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5484cee6-2855-4f08-a439-991e26f49c1f-kube-api-access-d97gp" (OuterVolumeSpecName: "kube-api-access-d97gp") pod "5484cee6-2855-4f08-a439-991e26f49c1f" (UID: "5484cee6-2855-4f08-a439-991e26f49c1f"). InnerVolumeSpecName "kube-api-access-d97gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.751965 4830 generic.go:334] "Generic (PLEG): container finished" podID="5484cee6-2855-4f08-a439-991e26f49c1f" containerID="0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869" exitCode=0 Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.752007 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5484cee6-2855-4f08-a439-991e26f49c1f","Type":"ContainerDied","Data":"0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869"} Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.752049 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5484cee6-2855-4f08-a439-991e26f49c1f","Type":"ContainerDied","Data":"42525cd884b1bf0fb02628686a268883e52602688c561c25cb76a2423d472b4a"} Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.752064 4830 scope.go:117] "RemoveContainer" 
containerID="4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.752179 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.757696 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5484cee6-2855-4f08-a439-991e26f49c1f" (UID: "5484cee6-2855-4f08-a439-991e26f49c1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.773574 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.773603 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.773614 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d97gp\" (UniqueName: \"kubernetes.io/projected/5484cee6-2855-4f08-a439-991e26f49c1f-kube-api-access-d97gp\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.773626 4830 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5484cee6-2855-4f08-a439-991e26f49c1f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.773635 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.794383 4830 scope.go:117] "RemoveContainer" containerID="0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.799065 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data" (OuterVolumeSpecName: "config-data") pod "5484cee6-2855-4f08-a439-991e26f49c1f" (UID: "5484cee6-2855-4f08-a439-991e26f49c1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.820541 4830 scope.go:117] "RemoveContainer" containerID="4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3" Mar 11 09:35:47 crc kubenswrapper[4830]: E0311 09:35:47.821913 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3\": container with ID starting with 4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3 not found: ID does not exist" containerID="4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.821972 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3"} err="failed to get container status \"4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3\": rpc error: code = NotFound desc = could not find container \"4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3\": container with ID starting with 4a004141310f920bbc1f159357cb7548dc0eda77101f527eb31946c2d0ea18f3 not found: ID does not exist" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 
09:35:47.821999 4830 scope.go:117] "RemoveContainer" containerID="0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869" Mar 11 09:35:47 crc kubenswrapper[4830]: E0311 09:35:47.823498 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869\": container with ID starting with 0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869 not found: ID does not exist" containerID="0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.823542 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869"} err="failed to get container status \"0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869\": rpc error: code = NotFound desc = could not find container \"0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869\": container with ID starting with 0d9a23a93b103d568e33fa290903e33cfe1a80cdac75bc80b638d5577512a869 not found: ID does not exist" Mar 11 09:35:47 crc kubenswrapper[4830]: I0311 09:35:47.877819 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5484cee6-2855-4f08-a439-991e26f49c1f-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.091502 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.108072 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.146074 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:48 crc kubenswrapper[4830]: E0311 09:35:48.146598 4830 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="addc50b8-ce68-4690-9979-07b2f596215d" containerName="init" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.147322 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="addc50b8-ce68-4690-9979-07b2f596215d" containerName="init" Mar 11 09:35:48 crc kubenswrapper[4830]: E0311 09:35:48.147539 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" containerName="cinder-scheduler" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.147553 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" containerName="cinder-scheduler" Mar 11 09:35:48 crc kubenswrapper[4830]: E0311 09:35:48.147583 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="addc50b8-ce68-4690-9979-07b2f596215d" containerName="dnsmasq-dns" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.147593 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="addc50b8-ce68-4690-9979-07b2f596215d" containerName="dnsmasq-dns" Mar 11 09:35:48 crc kubenswrapper[4830]: E0311 09:35:48.147611 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" containerName="probe" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.147622 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" containerName="probe" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.147860 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" containerName="probe" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.147878 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" containerName="cinder-scheduler" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.147904 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="addc50b8-ce68-4690-9979-07b2f596215d" containerName="dnsmasq-dns" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.149148 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.154928 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.179577 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.187976 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.188077 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.188129 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj8rr\" (UniqueName: \"kubernetes.io/projected/65554cb9-6d98-4e70-8feb-73029d8184dc-kube-api-access-nj8rr\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.188165 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.188195 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.188212 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65554cb9-6d98-4e70-8feb-73029d8184dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.290373 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.290464 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj8rr\" (UniqueName: \"kubernetes.io/projected/65554cb9-6d98-4e70-8feb-73029d8184dc-kube-api-access-nj8rr\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.290515 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.290556 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65554cb9-6d98-4e70-8feb-73029d8184dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.290575 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.290647 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.291618 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/65554cb9-6d98-4e70-8feb-73029d8184dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.295168 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.295478 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.304548 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.305882 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65554cb9-6d98-4e70-8feb-73029d8184dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.318359 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj8rr\" (UniqueName: \"kubernetes.io/projected/65554cb9-6d98-4e70-8feb-73029d8184dc-kube-api-access-nj8rr\") pod \"cinder-scheduler-0\" (UID: \"65554cb9-6d98-4e70-8feb-73029d8184dc\") " pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.483192 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 11 09:35:48 crc kubenswrapper[4830]: I0311 09:35:48.944966 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5484cee6-2855-4f08-a439-991e26f49c1f" path="/var/lib/kubelet/pods/5484cee6-2855-4f08-a439-991e26f49c1f/volumes" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.013464 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.035500 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.775700 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65554cb9-6d98-4e70-8feb-73029d8184dc","Type":"ContainerStarted","Data":"ec7ad545173a6469652f4223b70399772bbf2e6937a249493c4fea2fec6ea87e"} Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.797776 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.804679 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.807506 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-xbglf" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.809348 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.809524 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.826831 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.834721 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c113279e-3264-4a62-8c50-5ddb2be700bb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.834805 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c113279e-3264-4a62-8c50-5ddb2be700bb-openstack-config\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.834847 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c113279e-3264-4a62-8c50-5ddb2be700bb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.834905 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfchr\" (UniqueName: \"kubernetes.io/projected/c113279e-3264-4a62-8c50-5ddb2be700bb-kube-api-access-kfchr\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.938091 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfchr\" (UniqueName: \"kubernetes.io/projected/c113279e-3264-4a62-8c50-5ddb2be700bb-kube-api-access-kfchr\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.939106 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c113279e-3264-4a62-8c50-5ddb2be700bb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.946540 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c113279e-3264-4a62-8c50-5ddb2be700bb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.947391 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c113279e-3264-4a62-8c50-5ddb2be700bb-openstack-config\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.947429 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/c113279e-3264-4a62-8c50-5ddb2be700bb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.948648 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c113279e-3264-4a62-8c50-5ddb2be700bb-openstack-config\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.952480 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c113279e-3264-4a62-8c50-5ddb2be700bb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:49 crc kubenswrapper[4830]: I0311 09:35:49.962605 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfchr\" (UniqueName: \"kubernetes.io/projected/c113279e-3264-4a62-8c50-5ddb2be700bb-kube-api-access-kfchr\") pod \"openstackclient\" (UID: \"c113279e-3264-4a62-8c50-5ddb2be700bb\") " pod="openstack/openstackclient" Mar 11 09:35:50 crc kubenswrapper[4830]: I0311 09:35:50.231677 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 11 09:35:50 crc kubenswrapper[4830]: I0311 09:35:50.463943 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:50 crc kubenswrapper[4830]: I0311 09:35:50.788340 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65554cb9-6d98-4e70-8feb-73029d8184dc","Type":"ContainerStarted","Data":"3dd3649ddec0b6ff10d5a14f2622da7fb71426f83ede9b2479ee3bbf3c7af022"} Mar 11 09:35:50 crc kubenswrapper[4830]: I0311 09:35:50.790261 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"65554cb9-6d98-4e70-8feb-73029d8184dc","Type":"ContainerStarted","Data":"c2e8bde4c0330582e2a3c0416d9acab29ddc53a5d6f7cb34ffaf61da1dd1e54b"} Mar 11 09:35:50 crc kubenswrapper[4830]: I0311 09:35:50.791306 4830 generic.go:334] "Generic (PLEG): container finished" podID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerID="e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8" exitCode=0 Mar 11 09:35:50 crc kubenswrapper[4830]: I0311 09:35:50.791420 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6b87df74-q5t2v" event={"ID":"242c5a27-bc92-42f0-b630-6d1f3cd55822","Type":"ContainerDied","Data":"e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8"} Mar 11 09:35:50 crc kubenswrapper[4830]: I0311 09:35:50.817893 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.81783844 podStartE2EDuration="2.81783844s" podCreationTimestamp="2026-03-11 09:35:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:50.805624022 +0000 UTC m=+1318.586774731" watchObservedRunningTime="2026-03-11 09:35:50.81783844 +0000 UTC m=+1318.598989149" Mar 11 09:35:50 crc 
kubenswrapper[4830]: I0311 09:35:50.881257 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 11 09:35:50 crc kubenswrapper[4830]: I0311 09:35:50.954174 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.432276 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54c74bff69-478cc" Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.504193 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-586c785596-k4qp7"] Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.504554 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-586c785596-k4qp7" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerName="neutron-api" containerID="cri-o://116297744a0bf959e008764ba107ace74be7cbcf5efe31f187e28c2080a3ec4c" gracePeriod=30 Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.504752 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-586c785596-k4qp7" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerName="neutron-httpd" containerID="cri-o://6c94df8ed2b30d7dd1549194d3511d02e42eb5b3aa1da0ebf8f3399a63de4d72" gracePeriod=30 Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.657645 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-599b4448-86g7s" Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.829933 4830 generic.go:334] "Generic (PLEG): container finished" podID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerID="6c94df8ed2b30d7dd1549194d3511d02e42eb5b3aa1da0ebf8f3399a63de4d72" exitCode=0 Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.830007 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586c785596-k4qp7" 
event={"ID":"af048295-8bc1-42cb-8f67-3049b2dc4215","Type":"ContainerDied","Data":"6c94df8ed2b30d7dd1549194d3511d02e42eb5b3aa1da0ebf8f3399a63de4d72"} Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.835468 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c113279e-3264-4a62-8c50-5ddb2be700bb","Type":"ContainerStarted","Data":"5b694a754711cde891afe1cebd291391fde38d9d5bd1889c6fda15b0978dfd6a"} Mar 11 09:35:51 crc kubenswrapper[4830]: I0311 09:35:51.835790 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:52 crc kubenswrapper[4830]: I0311 09:35:52.011322 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5f6b87df74-q5t2v" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 11 09:35:52 crc kubenswrapper[4830]: I0311 09:35:52.931592 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7888bcc99b-t8slf" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.029337 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f556978d6-swcm4"] Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.029581 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f556978d6-swcm4" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api-log" containerID="cri-o://25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e" gracePeriod=30 Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.029957 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6f556978d6-swcm4" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api" 
containerID="cri-o://a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8" gracePeriod=30 Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.490148 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.492166 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7f56859c77-bnqc2"] Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.500106 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.504814 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.505214 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.509156 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.515691 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f56859c77-bnqc2"] Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.670810 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-config-data\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.670860 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-etc-swift\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: 
\"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.670910 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-internal-tls-certs\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.670941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffszr\" (UniqueName: \"kubernetes.io/projected/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-kube-api-access-ffszr\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.670968 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-run-httpd\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.670999 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-combined-ca-bundle\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.671035 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-public-tls-certs\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.671061 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-log-httpd\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.772326 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-config-data\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.772410 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-etc-swift\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.772584 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-internal-tls-certs\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.773305 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffszr\" (UniqueName: 
\"kubernetes.io/projected/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-kube-api-access-ffszr\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.773358 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-run-httpd\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.773413 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-combined-ca-bundle\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.773447 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-public-tls-certs\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.773498 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-log-httpd\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.773919 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-log-httpd\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.774487 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-run-httpd\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.782129 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-etc-swift\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.783782 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-combined-ca-bundle\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.788977 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-internal-tls-certs\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.789175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-public-tls-certs\") pod 
\"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.791957 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-config-data\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.807708 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffszr\" (UniqueName: \"kubernetes.io/projected/fcbbcfad-16c7-4040-9b06-b2ff9f4c5666-kube-api-access-ffszr\") pod \"swift-proxy-7f56859c77-bnqc2\" (UID: \"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666\") " pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.835977 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.876876 4830 generic.go:334] "Generic (PLEG): container finished" podID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerID="25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e" exitCode=143 Mar 11 09:35:53 crc kubenswrapper[4830]: I0311 09:35:53.876957 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f556978d6-swcm4" event={"ID":"41520592-8a80-47a4-a85c-6372ad2b5d28","Type":"ContainerDied","Data":"25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e"} Mar 11 09:35:54 crc kubenswrapper[4830]: I0311 09:35:54.498230 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f56859c77-bnqc2"] Mar 11 09:35:54 crc kubenswrapper[4830]: I0311 09:35:54.889577 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f56859c77-bnqc2" event={"ID":"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666","Type":"ContainerStarted","Data":"9ea5454c07724a6d8e5e96f3441269f65415830e383ba9e07f9087b0142543e9"} Mar 11 09:35:54 crc kubenswrapper[4830]: I0311 09:35:54.889914 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f56859c77-bnqc2" event={"ID":"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666","Type":"ContainerStarted","Data":"6fe9f18c8218a21d0a357e470f8ad533b95c4f5db36d0784570fa83ea3ef80a8"} Mar 11 09:35:55 crc kubenswrapper[4830]: I0311 09:35:55.919102 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f56859c77-bnqc2" event={"ID":"fcbbcfad-16c7-4040-9b06-b2ff9f4c5666","Type":"ContainerStarted","Data":"6939a816383f66b57b809d67a39690d2b1bedf0910e51b5d74459a5f021a59b2"} Mar 11 09:35:55 crc kubenswrapper[4830]: I0311 09:35:55.920176 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:55 crc kubenswrapper[4830]: I0311 09:35:55.920210 4830 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:35:55 crc kubenswrapper[4830]: I0311 09:35:55.957314 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7f56859c77-bnqc2" podStartSLOduration=2.95729416 podStartE2EDuration="2.95729416s" podCreationTimestamp="2026-03-11 09:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:35:55.945702038 +0000 UTC m=+1323.726852747" watchObservedRunningTime="2026-03-11 09:35:55.95729416 +0000 UTC m=+1323.738444849" Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.333077 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.333614 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="ceilometer-central-agent" containerID="cri-o://00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0" gracePeriod=30 Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.333684 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="proxy-httpd" containerID="cri-o://b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769" gracePeriod=30 Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.333749 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="ceilometer-notification-agent" containerID="cri-o://b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa" gracePeriod=30 Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.333886 4830 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="sg-core" containerID="cri-o://b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2" gracePeriod=30 Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.346365 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.168:3000/\": EOF" Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.924798 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.948886 4830 generic.go:334] "Generic (PLEG): container finished" podID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerID="a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8" exitCode=0 Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.949053 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f556978d6-swcm4" Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.949766 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f556978d6-swcm4" event={"ID":"41520592-8a80-47a4-a85c-6372ad2b5d28","Type":"ContainerDied","Data":"a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8"} Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.949792 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f556978d6-swcm4" event={"ID":"41520592-8a80-47a4-a85c-6372ad2b5d28","Type":"ContainerDied","Data":"b4213d17044e3c4d1f7306335653867de28a8f2a64e321c98d65e53d3055c9f6"} Mar 11 09:35:56 crc kubenswrapper[4830]: I0311 09:35:56.949809 4830 scope.go:117] "RemoveContainer" containerID="a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.014187 4830 scope.go:117] "RemoveContainer" containerID="25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.023257 4830 generic.go:334] "Generic (PLEG): container finished" podID="84753481-6831-4cec-8e80-e0141fdad711" containerID="b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769" exitCode=0 Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.023302 4830 generic.go:334] "Generic (PLEG): container finished" podID="84753481-6831-4cec-8e80-e0141fdad711" containerID="b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2" exitCode=2 Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.023313 4830 generic.go:334] "Generic (PLEG): container finished" podID="84753481-6831-4cec-8e80-e0141fdad711" containerID="b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa" exitCode=0 Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.024541 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerDied","Data":"b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769"} Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.024579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerDied","Data":"b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2"} Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.024595 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerDied","Data":"b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa"} Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.052675 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data-custom\") pod \"41520592-8a80-47a4-a85c-6372ad2b5d28\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.052732 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41520592-8a80-47a4-a85c-6372ad2b5d28-logs\") pod \"41520592-8a80-47a4-a85c-6372ad2b5d28\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.052816 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data\") pod \"41520592-8a80-47a4-a85c-6372ad2b5d28\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.052935 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfmb5\" (UniqueName: 
\"kubernetes.io/projected/41520592-8a80-47a4-a85c-6372ad2b5d28-kube-api-access-cfmb5\") pod \"41520592-8a80-47a4-a85c-6372ad2b5d28\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.052964 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-combined-ca-bundle\") pod \"41520592-8a80-47a4-a85c-6372ad2b5d28\" (UID: \"41520592-8a80-47a4-a85c-6372ad2b5d28\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.060379 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41520592-8a80-47a4-a85c-6372ad2b5d28-logs" (OuterVolumeSpecName: "logs") pod "41520592-8a80-47a4-a85c-6372ad2b5d28" (UID: "41520592-8a80-47a4-a85c-6372ad2b5d28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.082046 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "41520592-8a80-47a4-a85c-6372ad2b5d28" (UID: "41520592-8a80-47a4-a85c-6372ad2b5d28"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.089282 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41520592-8a80-47a4-a85c-6372ad2b5d28-kube-api-access-cfmb5" (OuterVolumeSpecName: "kube-api-access-cfmb5") pod "41520592-8a80-47a4-a85c-6372ad2b5d28" (UID: "41520592-8a80-47a4-a85c-6372ad2b5d28"). InnerVolumeSpecName "kube-api-access-cfmb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.091612 4830 scope.go:117] "RemoveContainer" containerID="a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8" Mar 11 09:35:57 crc kubenswrapper[4830]: E0311 09:35:57.092881 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8\": container with ID starting with a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8 not found: ID does not exist" containerID="a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.092928 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8"} err="failed to get container status \"a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8\": rpc error: code = NotFound desc = could not find container \"a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8\": container with ID starting with a11d130f31ada9664ce6bef67f73119b37b1f04b8e2e8a3f6a7d57b06d08c9a8 not found: ID does not exist" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.092955 4830 scope.go:117] "RemoveContainer" containerID="25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e" Mar 11 09:35:57 crc kubenswrapper[4830]: E0311 09:35:57.093303 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e\": container with ID starting with 25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e not found: ID does not exist" containerID="25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.093323 
4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e"} err="failed to get container status \"25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e\": rpc error: code = NotFound desc = could not find container \"25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e\": container with ID starting with 25a8b5695d484176dfa17e2922200ea2768195f5879d4f26fc9f9d97eb98d93e not found: ID does not exist" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.367247 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41520592-8a80-47a4-a85c-6372ad2b5d28" (UID: "41520592-8a80-47a4-a85c-6372ad2b5d28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.368474 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfmb5\" (UniqueName: \"kubernetes.io/projected/41520592-8a80-47a4-a85c-6372ad2b5d28-kube-api-access-cfmb5\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.368502 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.368512 4830 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.368522 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/41520592-8a80-47a4-a85c-6372ad2b5d28-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.382276 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data" (OuterVolumeSpecName: "config-data") pod "41520592-8a80-47a4-a85c-6372ad2b5d28" (UID: "41520592-8a80-47a4-a85c-6372ad2b5d28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.469766 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41520592-8a80-47a4-a85c-6372ad2b5d28-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.641867 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6f556978d6-swcm4"] Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.665524 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6f556978d6-swcm4"] Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.873681 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.982604 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-log-httpd\") pod \"84753481-6831-4cec-8e80-e0141fdad711\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.982665 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvt5\" (UniqueName: \"kubernetes.io/projected/84753481-6831-4cec-8e80-e0141fdad711-kube-api-access-hpvt5\") pod \"84753481-6831-4cec-8e80-e0141fdad711\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.982750 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-config-data\") pod \"84753481-6831-4cec-8e80-e0141fdad711\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.983307 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-scripts\") pod \"84753481-6831-4cec-8e80-e0141fdad711\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.983383 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-sg-core-conf-yaml\") pod \"84753481-6831-4cec-8e80-e0141fdad711\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.983438 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-run-httpd\") pod \"84753481-6831-4cec-8e80-e0141fdad711\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.983470 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-combined-ca-bundle\") pod \"84753481-6831-4cec-8e80-e0141fdad711\" (UID: \"84753481-6831-4cec-8e80-e0141fdad711\") " Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.984699 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "84753481-6831-4cec-8e80-e0141fdad711" (UID: "84753481-6831-4cec-8e80-e0141fdad711"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:57 crc kubenswrapper[4830]: I0311 09:35:57.985156 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "84753481-6831-4cec-8e80-e0141fdad711" (UID: "84753481-6831-4cec-8e80-e0141fdad711"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.005544 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84753481-6831-4cec-8e80-e0141fdad711-kube-api-access-hpvt5" (OuterVolumeSpecName: "kube-api-access-hpvt5") pod "84753481-6831-4cec-8e80-e0141fdad711" (UID: "84753481-6831-4cec-8e80-e0141fdad711"). InnerVolumeSpecName "kube-api-access-hpvt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.011652 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "84753481-6831-4cec-8e80-e0141fdad711" (UID: "84753481-6831-4cec-8e80-e0141fdad711"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.017762 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-scripts" (OuterVolumeSpecName: "scripts") pod "84753481-6831-4cec-8e80-e0141fdad711" (UID: "84753481-6831-4cec-8e80-e0141fdad711"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.045313 4830 generic.go:334] "Generic (PLEG): container finished" podID="84753481-6831-4cec-8e80-e0141fdad711" containerID="00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0" exitCode=0 Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.045452 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerDied","Data":"00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0"} Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.045583 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.045956 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84753481-6831-4cec-8e80-e0141fdad711","Type":"ContainerDied","Data":"ed02fe465a3879c81a95c7982e189be5daf6e8f60ecf6c3c3cfd615bcd60084e"} Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.046048 4830 scope.go:117] "RemoveContainer" containerID="b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.076441 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84753481-6831-4cec-8e80-e0141fdad711" (UID: "84753481-6831-4cec-8e80-e0141fdad711"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.087246 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.087275 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.087285 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.087294 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.087303 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84753481-6831-4cec-8e80-e0141fdad711-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.087311 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpvt5\" (UniqueName: \"kubernetes.io/projected/84753481-6831-4cec-8e80-e0141fdad711-kube-api-access-hpvt5\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.108436 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-config-data" (OuterVolumeSpecName: "config-data") pod "84753481-6831-4cec-8e80-e0141fdad711" (UID: "84753481-6831-4cec-8e80-e0141fdad711"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.118476 4830 scope.go:117] "RemoveContainer" containerID="b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.144385 4830 scope.go:117] "RemoveContainer" containerID="b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.176212 4830 scope.go:117] "RemoveContainer" containerID="00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.189533 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84753481-6831-4cec-8e80-e0141fdad711-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.201494 4830 scope.go:117] "RemoveContainer" containerID="b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769" Mar 11 09:35:58 crc 
kubenswrapper[4830]: E0311 09:35:58.202105 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769\": container with ID starting with b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769 not found: ID does not exist" containerID="b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.202142 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769"} err="failed to get container status \"b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769\": rpc error: code = NotFound desc = could not find container \"b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769\": container with ID starting with b731c06baddb4a635bbeb123ab0423a28a3914fceb0eb887c1e3b0136f40b769 not found: ID does not exist" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.202163 4830 scope.go:117] "RemoveContainer" containerID="b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2" Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.202530 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2\": container with ID starting with b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2 not found: ID does not exist" containerID="b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.202566 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2"} err="failed to get container status 
\"b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2\": rpc error: code = NotFound desc = could not find container \"b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2\": container with ID starting with b4bbfbd27d73212c84ed9b44c3e85815113c71ccf430c819adf8168a5cb686b2 not found: ID does not exist" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.202588 4830 scope.go:117] "RemoveContainer" containerID="b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa" Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.202814 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa\": container with ID starting with b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa not found: ID does not exist" containerID="b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.202838 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa"} err="failed to get container status \"b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa\": rpc error: code = NotFound desc = could not find container \"b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa\": container with ID starting with b6d055bdb06e647992a9cdfe0b0e54ce4e6d124a17c66144175d804e4494b4fa not found: ID does not exist" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.202874 4830 scope.go:117] "RemoveContainer" containerID="00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0" Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.203193 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0\": container with ID starting with 00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0 not found: ID does not exist" containerID="00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.203222 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0"} err="failed to get container status \"00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0\": rpc error: code = NotFound desc = could not find container \"00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0\": container with ID starting with 00cb4141897f40fb9b1e21d235874c6270cc75722ee8e5832cbb87fe022fe4a0 not found: ID does not exist" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.419093 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.439859 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449071 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.449400 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="ceilometer-notification-agent" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449417 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="ceilometer-notification-agent" Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.449436 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api-log" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 
09:35:58.449442 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api-log" Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.449453 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449459 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api" Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.449484 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="sg-core" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449490 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="sg-core" Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.449501 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="ceilometer-central-agent" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449506 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="ceilometer-central-agent" Mar 11 09:35:58 crc kubenswrapper[4830]: E0311 09:35:58.449517 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="proxy-httpd" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449524 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="proxy-httpd" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449667 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449681 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="ceilometer-notification-agent" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449694 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="ceilometer-central-agent" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449712 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="sg-core" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449724 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="84753481-6831-4cec-8e80-e0141fdad711" containerName="proxy-httpd" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.449733 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" containerName="barbican-api-log" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.451289 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.453726 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.453999 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.459820 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.496655 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8hh8\" (UniqueName: \"kubernetes.io/projected/7b227cb3-df95-4a38-b843-ff5cfe922fe1-kube-api-access-w8hh8\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.496720 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.496890 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-config-data\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.496977 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-scripts\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " 
pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.497099 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.497131 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-run-httpd\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.497167 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-log-httpd\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.599335 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-scripts\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.599768 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.599795 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-run-httpd\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.599823 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-log-httpd\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.599895 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8hh8\" (UniqueName: \"kubernetes.io/projected/7b227cb3-df95-4a38-b843-ff5cfe922fe1-kube-api-access-w8hh8\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.599933 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.599992 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-config-data\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.600690 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-run-httpd\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc 
kubenswrapper[4830]: I0311 09:35:58.601084 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-log-httpd\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.605425 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-scripts\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.608084 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.612228 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-config-data\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.613613 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.623115 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8hh8\" (UniqueName: \"kubernetes.io/projected/7b227cb3-df95-4a38-b843-ff5cfe922fe1-kube-api-access-w8hh8\") pod \"ceilometer-0\" (UID: 
\"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.727689 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.777111 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.953224 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41520592-8a80-47a4-a85c-6372ad2b5d28" path="/var/lib/kubelet/pods/41520592-8a80-47a4-a85c-6372ad2b5d28/volumes" Mar 11 09:35:58 crc kubenswrapper[4830]: I0311 09:35:58.953950 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84753481-6831-4cec-8e80-e0141fdad711" path="/var/lib/kubelet/pods/84753481-6831-4cec-8e80-e0141fdad711/volumes" Mar 11 09:35:59 crc kubenswrapper[4830]: I0311 09:35:59.059161 4830 generic.go:334] "Generic (PLEG): container finished" podID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerID="116297744a0bf959e008764ba107ace74be7cbcf5efe31f187e28c2080a3ec4c" exitCode=0 Mar 11 09:35:59 crc kubenswrapper[4830]: I0311 09:35:59.059587 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586c785596-k4qp7" event={"ID":"af048295-8bc1-42cb-8f67-3049b2dc4215","Type":"ContainerDied","Data":"116297744a0bf959e008764ba107ace74be7cbcf5efe31f187e28c2080a3ec4c"} Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.146493 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553696-9lrt8"] Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.147669 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553696-9lrt8" Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.150933 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.151986 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.153715 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.157704 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553696-9lrt8"] Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.335539 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7454p\" (UniqueName: \"kubernetes.io/projected/04bdf020-11b4-4125-ad87-0a30df4278b9-kube-api-access-7454p\") pod \"auto-csr-approver-29553696-9lrt8\" (UID: \"04bdf020-11b4-4125-ad87-0a30df4278b9\") " pod="openshift-infra/auto-csr-approver-29553696-9lrt8" Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.439459 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7454p\" (UniqueName: \"kubernetes.io/projected/04bdf020-11b4-4125-ad87-0a30df4278b9-kube-api-access-7454p\") pod \"auto-csr-approver-29553696-9lrt8\" (UID: \"04bdf020-11b4-4125-ad87-0a30df4278b9\") " pod="openshift-infra/auto-csr-approver-29553696-9lrt8" Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.443068 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.465130 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7454p\" (UniqueName: 
\"kubernetes.io/projected/04bdf020-11b4-4125-ad87-0a30df4278b9-kube-api-access-7454p\") pod \"auto-csr-approver-29553696-9lrt8\" (UID: \"04bdf020-11b4-4125-ad87-0a30df4278b9\") " pod="openshift-infra/auto-csr-approver-29553696-9lrt8" Mar 11 09:36:00 crc kubenswrapper[4830]: I0311 09:36:00.476139 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553696-9lrt8" Mar 11 09:36:02 crc kubenswrapper[4830]: I0311 09:36:02.011330 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5f6b87df74-q5t2v" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.843314 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.845463 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f56859c77-bnqc2" Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.869297 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.918208 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-httpd-config\") pod \"af048295-8bc1-42cb-8f67-3049b2dc4215\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.918308 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-ovndb-tls-certs\") pod \"af048295-8bc1-42cb-8f67-3049b2dc4215\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.918380 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26xqv\" (UniqueName: \"kubernetes.io/projected/af048295-8bc1-42cb-8f67-3049b2dc4215-kube-api-access-26xqv\") pod \"af048295-8bc1-42cb-8f67-3049b2dc4215\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.918418 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-config\") pod \"af048295-8bc1-42cb-8f67-3049b2dc4215\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.918450 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-combined-ca-bundle\") pod \"af048295-8bc1-42cb-8f67-3049b2dc4215\" (UID: \"af048295-8bc1-42cb-8f67-3049b2dc4215\") " Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.929418 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/af048295-8bc1-42cb-8f67-3049b2dc4215-kube-api-access-26xqv" (OuterVolumeSpecName: "kube-api-access-26xqv") pod "af048295-8bc1-42cb-8f67-3049b2dc4215" (UID: "af048295-8bc1-42cb-8f67-3049b2dc4215"). InnerVolumeSpecName "kube-api-access-26xqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.944334 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "af048295-8bc1-42cb-8f67-3049b2dc4215" (UID: "af048295-8bc1-42cb-8f67-3049b2dc4215"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.981973 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-config" (OuterVolumeSpecName: "config") pod "af048295-8bc1-42cb-8f67-3049b2dc4215" (UID: "af048295-8bc1-42cb-8f67-3049b2dc4215"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:03 crc kubenswrapper[4830]: I0311 09:36:03.997915 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "af048295-8bc1-42cb-8f67-3049b2dc4215" (UID: "af048295-8bc1-42cb-8f67-3049b2dc4215"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.000309 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af048295-8bc1-42cb-8f67-3049b2dc4215" (UID: "af048295-8bc1-42cb-8f67-3049b2dc4215"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.020463 4830 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.020561 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26xqv\" (UniqueName: \"kubernetes.io/projected/af048295-8bc1-42cb-8f67-3049b2dc4215-kube-api-access-26xqv\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.020582 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.020597 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.020609 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/af048295-8bc1-42cb-8f67-3049b2dc4215-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.074114 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:04 crc kubenswrapper[4830]: W0311 09:36:04.075911 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b227cb3_df95_4a38_b843_ff5cfe922fe1.slice/crio-ab35d7d7cc735455d1080123cc5a3d101fff16e4fcf6ea307c98be4eab3f29c1 WatchSource:0}: Error finding container ab35d7d7cc735455d1080123cc5a3d101fff16e4fcf6ea307c98be4eab3f29c1: Status 404 returned error can't find the 
container with id ab35d7d7cc735455d1080123cc5a3d101fff16e4fcf6ea307c98be4eab3f29c1 Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.131962 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c113279e-3264-4a62-8c50-5ddb2be700bb","Type":"ContainerStarted","Data":"c439157b9892be3d9dc5b2c74082a487890ebfa0df7dcc2bd2a2e98c9ce5f840"} Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.133614 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerStarted","Data":"ab35d7d7cc735455d1080123cc5a3d101fff16e4fcf6ea307c98be4eab3f29c1"} Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.135383 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-586c785596-k4qp7" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.135404 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-586c785596-k4qp7" event={"ID":"af048295-8bc1-42cb-8f67-3049b2dc4215","Type":"ContainerDied","Data":"f44aeabdbcdbebd6ed7ccb05a04910e8be6e86a94d8b632834a18e5063e2bb4c"} Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.135452 4830 scope.go:117] "RemoveContainer" containerID="6c94df8ed2b30d7dd1549194d3511d02e42eb5b3aa1da0ebf8f3399a63de4d72" Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.153518 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.444059294 podStartE2EDuration="15.153501796s" podCreationTimestamp="2026-03-11 09:35:49 +0000 UTC" firstStartedPulling="2026-03-11 09:35:50.887830232 +0000 UTC m=+1318.668980921" lastFinishedPulling="2026-03-11 09:36:03.597272734 +0000 UTC m=+1331.378423423" observedRunningTime="2026-03-11 09:36:04.148639431 +0000 UTC m=+1331.929790110" watchObservedRunningTime="2026-03-11 09:36:04.153501796 +0000 UTC m=+1331.934652485" Mar 11 09:36:04 crc 
kubenswrapper[4830]: I0311 09:36:04.188619 4830 scope.go:117] "RemoveContainer" containerID="116297744a0bf959e008764ba107ace74be7cbcf5efe31f187e28c2080a3ec4c" Mar 11 09:36:04 crc kubenswrapper[4830]: W0311 09:36:04.193443 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04bdf020_11b4_4125_ad87_0a30df4278b9.slice/crio-46da1fb0f19798081e652e813431092ba61b39ec2615945e1dd11c5d886ce792 WatchSource:0}: Error finding container 46da1fb0f19798081e652e813431092ba61b39ec2615945e1dd11c5d886ce792: Status 404 returned error can't find the container with id 46da1fb0f19798081e652e813431092ba61b39ec2615945e1dd11c5d886ce792 Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.196096 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553696-9lrt8"] Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.208033 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-586c785596-k4qp7"] Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.216607 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-586c785596-k4qp7"] Mar 11 09:36:04 crc kubenswrapper[4830]: I0311 09:36:04.944761 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" path="/var/lib/kubelet/pods/af048295-8bc1-42cb-8f67-3049b2dc4215/volumes" Mar 11 09:36:05 crc kubenswrapper[4830]: I0311 09:36:05.151738 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerStarted","Data":"91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7"} Mar 11 09:36:05 crc kubenswrapper[4830]: I0311 09:36:05.154579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553696-9lrt8" 
event={"ID":"04bdf020-11b4-4125-ad87-0a30df4278b9","Type":"ContainerStarted","Data":"46da1fb0f19798081e652e813431092ba61b39ec2615945e1dd11c5d886ce792"} Mar 11 09:36:06 crc kubenswrapper[4830]: I0311 09:36:06.171925 4830 generic.go:334] "Generic (PLEG): container finished" podID="04bdf020-11b4-4125-ad87-0a30df4278b9" containerID="ec23955f74aa6bd8b3869586d4a33d78135366dabb6184a74677c328953c2e1b" exitCode=0 Mar 11 09:36:06 crc kubenswrapper[4830]: I0311 09:36:06.172060 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553696-9lrt8" event={"ID":"04bdf020-11b4-4125-ad87-0a30df4278b9","Type":"ContainerDied","Data":"ec23955f74aa6bd8b3869586d4a33d78135366dabb6184a74677c328953c2e1b"} Mar 11 09:36:06 crc kubenswrapper[4830]: I0311 09:36:06.176430 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerStarted","Data":"175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325"} Mar 11 09:36:07 crc kubenswrapper[4830]: I0311 09:36:07.186184 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerStarted","Data":"ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d"} Mar 11 09:36:07 crc kubenswrapper[4830]: I0311 09:36:07.546405 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553696-9lrt8" Mar 11 09:36:07 crc kubenswrapper[4830]: I0311 09:36:07.696859 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7454p\" (UniqueName: \"kubernetes.io/projected/04bdf020-11b4-4125-ad87-0a30df4278b9-kube-api-access-7454p\") pod \"04bdf020-11b4-4125-ad87-0a30df4278b9\" (UID: \"04bdf020-11b4-4125-ad87-0a30df4278b9\") " Mar 11 09:36:07 crc kubenswrapper[4830]: I0311 09:36:07.702052 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04bdf020-11b4-4125-ad87-0a30df4278b9-kube-api-access-7454p" (OuterVolumeSpecName: "kube-api-access-7454p") pod "04bdf020-11b4-4125-ad87-0a30df4278b9" (UID: "04bdf020-11b4-4125-ad87-0a30df4278b9"). InnerVolumeSpecName "kube-api-access-7454p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:07 crc kubenswrapper[4830]: I0311 09:36:07.799648 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7454p\" (UniqueName: \"kubernetes.io/projected/04bdf020-11b4-4125-ad87-0a30df4278b9-kube-api-access-7454p\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:08 crc kubenswrapper[4830]: I0311 09:36:08.210622 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553696-9lrt8" event={"ID":"04bdf020-11b4-4125-ad87-0a30df4278b9","Type":"ContainerDied","Data":"46da1fb0f19798081e652e813431092ba61b39ec2615945e1dd11c5d886ce792"} Mar 11 09:36:08 crc kubenswrapper[4830]: I0311 09:36:08.212009 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46da1fb0f19798081e652e813431092ba61b39ec2615945e1dd11c5d886ce792" Mar 11 09:36:08 crc kubenswrapper[4830]: I0311 09:36:08.210866 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553696-9lrt8" Mar 11 09:36:08 crc kubenswrapper[4830]: I0311 09:36:08.621340 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553690-86dps"] Mar 11 09:36:08 crc kubenswrapper[4830]: I0311 09:36:08.629493 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553690-86dps"] Mar 11 09:36:08 crc kubenswrapper[4830]: I0311 09:36:08.945490 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb8fb2d-321b-4489-92c2-5a314ae41dbf" path="/var/lib/kubelet/pods/adb8fb2d-321b-4489-92c2-5a314ae41dbf/volumes" Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.207486 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.207776 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerName="glance-log" containerID="cri-o://2b540663e7c741945a9ec92868556e9360efff84fd073507c873423b1393f6a6" gracePeriod=30 Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.208269 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerName="glance-httpd" containerID="cri-o://e1c2ee352a1aa9ad3a87217fdc9b26896b4b6b672422c1eb960b0604da915f64" gracePeriod=30 Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.231452 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerStarted","Data":"08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312"} Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.232534 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.232034 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="sg-core" containerID="cri-o://ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d" gracePeriod=30 Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.231633 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="ceilometer-central-agent" containerID="cri-o://91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7" gracePeriod=30 Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.232102 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="ceilometer-notification-agent" containerID="cri-o://175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325" gracePeriod=30 Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.232079 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="proxy-httpd" containerID="cri-o://08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312" gracePeriod=30 Mar 11 09:36:09 crc kubenswrapper[4830]: I0311 09:36:09.260410 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.363825734 podStartE2EDuration="11.260388971s" podCreationTimestamp="2026-03-11 09:35:58 +0000 UTC" firstStartedPulling="2026-03-11 09:36:04.079138423 +0000 UTC m=+1331.860289102" lastFinishedPulling="2026-03-11 09:36:07.97570165 +0000 UTC m=+1335.756852339" observedRunningTime="2026-03-11 09:36:09.250100636 +0000 UTC m=+1337.031251355" watchObservedRunningTime="2026-03-11 
09:36:09.260388971 +0000 UTC m=+1337.041539660" Mar 11 09:36:09 crc kubenswrapper[4830]: E0311 09:36:09.307376 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65344ab0_3c56_4a1b_ac72_ef54fbf8da4a.slice/crio-conmon-2b540663e7c741945a9ec92868556e9360efff84fd073507c873423b1393f6a6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b227cb3_df95_4a38_b843_ff5cfe922fe1.slice/crio-ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d.scope\": RecentStats: unable to find data in memory cache]" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.133997 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-t4g4z"] Mar 11 09:36:10 crc kubenswrapper[4830]: E0311 09:36:10.134685 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerName="neutron-api" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.134701 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerName="neutron-api" Mar 11 09:36:10 crc kubenswrapper[4830]: E0311 09:36:10.134721 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04bdf020-11b4-4125-ad87-0a30df4278b9" containerName="oc" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.134728 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="04bdf020-11b4-4125-ad87-0a30df4278b9" containerName="oc" Mar 11 09:36:10 crc kubenswrapper[4830]: E0311 09:36:10.134749 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerName="neutron-httpd" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.134755 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" 
containerName="neutron-httpd" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.134928 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerName="neutron-api" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.134946 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="04bdf020-11b4-4125-ad87-0a30df4278b9" containerName="oc" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.134963 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af048295-8bc1-42cb-8f67-3049b2dc4215" containerName="neutron-httpd" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.136111 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t4g4z" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.181774 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t4g4z"] Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.230142 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gsqqs"] Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.231221 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gsqqs" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.241088 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6d682a-4504-4159-852e-36c0e757a98c-operator-scripts\") pod \"nova-api-db-create-t4g4z\" (UID: \"8b6d682a-4504-4159-852e-36c0e757a98c\") " pod="openstack/nova-api-db-create-t4g4z" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.241255 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbqpw\" (UniqueName: \"kubernetes.io/projected/8b6d682a-4504-4159-852e-36c0e757a98c-kube-api-access-bbqpw\") pod \"nova-api-db-create-t4g4z\" (UID: \"8b6d682a-4504-4159-852e-36c0e757a98c\") " pod="openstack/nova-api-db-create-t4g4z" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.242176 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f109-account-create-update-xvvcg"] Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.243375 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f109-account-create-update-xvvcg" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.245314 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.260901 4830 generic.go:334] "Generic (PLEG): container finished" podID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerID="08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312" exitCode=0 Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.260938 4830 generic.go:334] "Generic (PLEG): container finished" podID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerID="ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d" exitCode=2 Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.260947 4830 generic.go:334] "Generic (PLEG): container finished" podID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerID="175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325" exitCode=0 Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.260990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerDied","Data":"08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312"} Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.261033 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerDied","Data":"ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d"} Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.261049 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerDied","Data":"175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325"} Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.267071 4830 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gsqqs"] Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.274095 4830 generic.go:334] "Generic (PLEG): container finished" podID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerID="2b540663e7c741945a9ec92868556e9360efff84fd073507c873423b1393f6a6" exitCode=143 Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.274144 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a","Type":"ContainerDied","Data":"2b540663e7c741945a9ec92868556e9360efff84fd073507c873423b1393f6a6"} Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.284642 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f109-account-create-update-xvvcg"] Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.335654 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4dsqs"] Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.336981 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4dsqs" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.345491 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-operator-scripts\") pod \"nova-cell1-db-create-4dsqs\" (UID: \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\") " pod="openstack/nova-cell1-db-create-4dsqs" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.345560 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbqpw\" (UniqueName: \"kubernetes.io/projected/8b6d682a-4504-4159-852e-36c0e757a98c-kube-api-access-bbqpw\") pod \"nova-api-db-create-t4g4z\" (UID: \"8b6d682a-4504-4159-852e-36c0e757a98c\") " pod="openstack/nova-api-db-create-t4g4z" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.345650 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2xh9\" (UniqueName: \"kubernetes.io/projected/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-kube-api-access-t2xh9\") pod \"nova-cell0-db-create-gsqqs\" (UID: \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\") " pod="openstack/nova-cell0-db-create-gsqqs" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.345695 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6d682a-4504-4159-852e-36c0e757a98c-operator-scripts\") pod \"nova-api-db-create-t4g4z\" (UID: \"8b6d682a-4504-4159-852e-36c0e757a98c\") " pod="openstack/nova-api-db-create-t4g4z" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.345720 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe70376b-be05-4aba-a39a-850335299924-operator-scripts\") pod 
\"nova-api-f109-account-create-update-xvvcg\" (UID: \"fe70376b-be05-4aba-a39a-850335299924\") " pod="openstack/nova-api-f109-account-create-update-xvvcg" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.345744 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-operator-scripts\") pod \"nova-cell0-db-create-gsqqs\" (UID: \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\") " pod="openstack/nova-cell0-db-create-gsqqs" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.345806 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4t7q\" (UniqueName: \"kubernetes.io/projected/fe70376b-be05-4aba-a39a-850335299924-kube-api-access-c4t7q\") pod \"nova-api-f109-account-create-update-xvvcg\" (UID: \"fe70376b-be05-4aba-a39a-850335299924\") " pod="openstack/nova-api-f109-account-create-update-xvvcg" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.345835 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6mb\" (UniqueName: \"kubernetes.io/projected/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-kube-api-access-rn6mb\") pod \"nova-cell1-db-create-4dsqs\" (UID: \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\") " pod="openstack/nova-cell1-db-create-4dsqs" Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.346905 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4dsqs"] Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.347071 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6d682a-4504-4159-852e-36c0e757a98c-operator-scripts\") pod \"nova-api-db-create-t4g4z\" (UID: \"8b6d682a-4504-4159-852e-36c0e757a98c\") " pod="openstack/nova-api-db-create-t4g4z" Mar 11 09:36:10 crc 
kubenswrapper[4830]: I0311 09:36:10.417390 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbqpw\" (UniqueName: \"kubernetes.io/projected/8b6d682a-4504-4159-852e-36c0e757a98c-kube-api-access-bbqpw\") pod \"nova-api-db-create-t4g4z\" (UID: \"8b6d682a-4504-4159-852e-36c0e757a98c\") " pod="openstack/nova-api-db-create-t4g4z"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.447550 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4t7q\" (UniqueName: \"kubernetes.io/projected/fe70376b-be05-4aba-a39a-850335299924-kube-api-access-c4t7q\") pod \"nova-api-f109-account-create-update-xvvcg\" (UID: \"fe70376b-be05-4aba-a39a-850335299924\") " pod="openstack/nova-api-f109-account-create-update-xvvcg"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.447600 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6mb\" (UniqueName: \"kubernetes.io/projected/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-kube-api-access-rn6mb\") pod \"nova-cell1-db-create-4dsqs\" (UID: \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\") " pod="openstack/nova-cell1-db-create-4dsqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.447647 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-operator-scripts\") pod \"nova-cell1-db-create-4dsqs\" (UID: \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\") " pod="openstack/nova-cell1-db-create-4dsqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.447712 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xh9\" (UniqueName: \"kubernetes.io/projected/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-kube-api-access-t2xh9\") pod \"nova-cell0-db-create-gsqqs\" (UID: \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\") " pod="openstack/nova-cell0-db-create-gsqqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.447738 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe70376b-be05-4aba-a39a-850335299924-operator-scripts\") pod \"nova-api-f109-account-create-update-xvvcg\" (UID: \"fe70376b-be05-4aba-a39a-850335299924\") " pod="openstack/nova-api-f109-account-create-update-xvvcg"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.447754 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-operator-scripts\") pod \"nova-cell0-db-create-gsqqs\" (UID: \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\") " pod="openstack/nova-cell0-db-create-gsqqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.448397 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-operator-scripts\") pod \"nova-cell0-db-create-gsqqs\" (UID: \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\") " pod="openstack/nova-cell0-db-create-gsqqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.449242 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-operator-scripts\") pod \"nova-cell1-db-create-4dsqs\" (UID: \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\") " pod="openstack/nova-cell1-db-create-4dsqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.449807 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe70376b-be05-4aba-a39a-850335299924-operator-scripts\") pod \"nova-api-f109-account-create-update-xvvcg\" (UID: \"fe70376b-be05-4aba-a39a-850335299924\") " pod="openstack/nova-api-f109-account-create-update-xvvcg"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.449864 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2da6-account-create-update-8v4qc"]
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.451009 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2da6-account-create-update-8v4qc"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.452481 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t4g4z"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.453592 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.462538 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2da6-account-create-update-8v4qc"]
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.467429 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4t7q\" (UniqueName: \"kubernetes.io/projected/fe70376b-be05-4aba-a39a-850335299924-kube-api-access-c4t7q\") pod \"nova-api-f109-account-create-update-xvvcg\" (UID: \"fe70376b-be05-4aba-a39a-850335299924\") " pod="openstack/nova-api-f109-account-create-update-xvvcg"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.484152 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xh9\" (UniqueName: \"kubernetes.io/projected/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-kube-api-access-t2xh9\") pod \"nova-cell0-db-create-gsqqs\" (UID: \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\") " pod="openstack/nova-cell0-db-create-gsqqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.484842 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6mb\" (UniqueName: \"kubernetes.io/projected/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-kube-api-access-rn6mb\") pod \"nova-cell1-db-create-4dsqs\" (UID: \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\") " pod="openstack/nova-cell1-db-create-4dsqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.546180 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gsqqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.549339 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-778m8\" (UniqueName: \"kubernetes.io/projected/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-kube-api-access-778m8\") pod \"nova-cell0-2da6-account-create-update-8v4qc\" (UID: \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\") " pod="openstack/nova-cell0-2da6-account-create-update-8v4qc"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.549445 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-operator-scripts\") pod \"nova-cell0-2da6-account-create-update-8v4qc\" (UID: \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\") " pod="openstack/nova-cell0-2da6-account-create-update-8v4qc"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.562702 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f109-account-create-update-xvvcg"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.648190 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-74ec-account-create-update-qnfw8"]
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.649451 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-74ec-account-create-update-qnfw8"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.651616 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641dc551-b5bc-455e-9deb-20542ef0ab9b-operator-scripts\") pod \"nova-cell1-74ec-account-create-update-qnfw8\" (UID: \"641dc551-b5bc-455e-9deb-20542ef0ab9b\") " pod="openstack/nova-cell1-74ec-account-create-update-qnfw8"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.651662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-778m8\" (UniqueName: \"kubernetes.io/projected/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-kube-api-access-778m8\") pod \"nova-cell0-2da6-account-create-update-8v4qc\" (UID: \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\") " pod="openstack/nova-cell0-2da6-account-create-update-8v4qc"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.651729 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-operator-scripts\") pod \"nova-cell0-2da6-account-create-update-8v4qc\" (UID: \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\") " pod="openstack/nova-cell0-2da6-account-create-update-8v4qc"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.651764 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnds\" (UniqueName: \"kubernetes.io/projected/641dc551-b5bc-455e-9deb-20542ef0ab9b-kube-api-access-8nnds\") pod \"nova-cell1-74ec-account-create-update-qnfw8\" (UID: \"641dc551-b5bc-455e-9deb-20542ef0ab9b\") " pod="openstack/nova-cell1-74ec-account-create-update-qnfw8"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.655384 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.656004 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-operator-scripts\") pod \"nova-cell0-2da6-account-create-update-8v4qc\" (UID: \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\") " pod="openstack/nova-cell0-2da6-account-create-update-8v4qc"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.659294 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4dsqs"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.677140 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-74ec-account-create-update-qnfw8"]
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.679769 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-778m8\" (UniqueName: \"kubernetes.io/projected/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-kube-api-access-778m8\") pod \"nova-cell0-2da6-account-create-update-8v4qc\" (UID: \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\") " pod="openstack/nova-cell0-2da6-account-create-update-8v4qc"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.753517 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnds\" (UniqueName: \"kubernetes.io/projected/641dc551-b5bc-455e-9deb-20542ef0ab9b-kube-api-access-8nnds\") pod \"nova-cell1-74ec-account-create-update-qnfw8\" (UID: \"641dc551-b5bc-455e-9deb-20542ef0ab9b\") " pod="openstack/nova-cell1-74ec-account-create-update-qnfw8"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.753674 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641dc551-b5bc-455e-9deb-20542ef0ab9b-operator-scripts\") pod \"nova-cell1-74ec-account-create-update-qnfw8\" (UID: \"641dc551-b5bc-455e-9deb-20542ef0ab9b\") " pod="openstack/nova-cell1-74ec-account-create-update-qnfw8"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.754602 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641dc551-b5bc-455e-9deb-20542ef0ab9b-operator-scripts\") pod \"nova-cell1-74ec-account-create-update-qnfw8\" (UID: \"641dc551-b5bc-455e-9deb-20542ef0ab9b\") " pod="openstack/nova-cell1-74ec-account-create-update-qnfw8"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.777649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nnds\" (UniqueName: \"kubernetes.io/projected/641dc551-b5bc-455e-9deb-20542ef0ab9b-kube-api-access-8nnds\") pod \"nova-cell1-74ec-account-create-update-qnfw8\" (UID: \"641dc551-b5bc-455e-9deb-20542ef0ab9b\") " pod="openstack/nova-cell1-74ec-account-create-update-qnfw8"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.846361 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2da6-account-create-update-8v4qc"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.864398 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-74ec-account-create-update-qnfw8"
Mar 11 09:36:10 crc kubenswrapper[4830]: I0311 09:36:10.984535 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t4g4z"]
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.143552 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gsqqs"]
Mar 11 09:36:11 crc kubenswrapper[4830]: W0311 09:36:11.147923 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61a2e8d0_730f_4d64_ad5c_87e35dda7be9.slice/crio-3a6c68126f9b7c240e4d0002a8c799b6a26816797f4c8aa2b74ec101b64410d5 WatchSource:0}: Error finding container 3a6c68126f9b7c240e4d0002a8c799b6a26816797f4c8aa2b74ec101b64410d5: Status 404 returned error can't find the container with id 3a6c68126f9b7c240e4d0002a8c799b6a26816797f4c8aa2b74ec101b64410d5
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.197370 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f109-account-create-update-xvvcg"]
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.207294 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4dsqs"]
Mar 11 09:36:11 crc kubenswrapper[4830]: W0311 09:36:11.218547 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe70376b_be05_4aba_a39a_850335299924.slice/crio-a6c7646e716fc1151d7bf4a675feeef089e8814dcbaa0fa0461c673d1cb3ba69 WatchSource:0}: Error finding container a6c7646e716fc1151d7bf4a675feeef089e8814dcbaa0fa0461c673d1cb3ba69: Status 404 returned error can't find the container with id a6c7646e716fc1151d7bf4a675feeef089e8814dcbaa0fa0461c673d1cb3ba69
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.345843 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f109-account-create-update-xvvcg" event={"ID":"fe70376b-be05-4aba-a39a-850335299924","Type":"ContainerStarted","Data":"a6c7646e716fc1151d7bf4a675feeef089e8814dcbaa0fa0461c673d1cb3ba69"}
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.350562 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gsqqs" event={"ID":"61a2e8d0-730f-4d64-ad5c-87e35dda7be9","Type":"ContainerStarted","Data":"3a6c68126f9b7c240e4d0002a8c799b6a26816797f4c8aa2b74ec101b64410d5"}
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.365584 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t4g4z" event={"ID":"8b6d682a-4504-4159-852e-36c0e757a98c","Type":"ContainerStarted","Data":"3208f82465fbac146679003c82468db5e6d21ea3115c4a6fb0124fea755e16bc"}
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.368207 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4dsqs" event={"ID":"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d","Type":"ContainerStarted","Data":"1566ed56ef10e28454b6a11ca5990efd96004132ea325981b578cb9280862685"}
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.420812 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-74ec-account-create-update-qnfw8"]
Mar 11 09:36:11 crc kubenswrapper[4830]: I0311 09:36:11.513829 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2da6-account-create-update-8v4qc"]
Mar 11 09:36:11 crc kubenswrapper[4830]: W0311 09:36:11.528883 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95b1c23f_429e_4d5c_85d0_a6cfc1816ae0.slice/crio-4cfd46a2745cd264bdc2086cf9d9f296dc539b8996917e725002817f3e17f6ca WatchSource:0}: Error finding container 4cfd46a2745cd264bdc2086cf9d9f296dc539b8996917e725002817f3e17f6ca: Status 404 returned error can't find the container with id 4cfd46a2745cd264bdc2086cf9d9f296dc539b8996917e725002817f3e17f6ca
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.011528 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5f6b87df74-q5t2v" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused"
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.011647 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5f6b87df74-q5t2v"
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.377646 4830 generic.go:334] "Generic (PLEG): container finished" podID="641dc551-b5bc-455e-9deb-20542ef0ab9b" containerID="0d8fdb376611f5d58a9469459b7dc7d5122fc75221799e93480fd1daba7014b2" exitCode=0
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.377824 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-74ec-account-create-update-qnfw8" event={"ID":"641dc551-b5bc-455e-9deb-20542ef0ab9b","Type":"ContainerDied","Data":"0d8fdb376611f5d58a9469459b7dc7d5122fc75221799e93480fd1daba7014b2"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.377912 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-74ec-account-create-update-qnfw8" event={"ID":"641dc551-b5bc-455e-9deb-20542ef0ab9b","Type":"ContainerStarted","Data":"837b27c9aa288a8c9f8e3e07c029bb8a91d547509fe719e70c6d9e62f966f413"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.379725 4830 generic.go:334] "Generic (PLEG): container finished" podID="95b1c23f-429e-4d5c-85d0-a6cfc1816ae0" containerID="e7bc08a3b17cbc5f85b1475599e86b823bd791b4e505f87a6808f116e94b420e" exitCode=0
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.379765 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2da6-account-create-update-8v4qc" event={"ID":"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0","Type":"ContainerDied","Data":"e7bc08a3b17cbc5f85b1475599e86b823bd791b4e505f87a6808f116e94b420e"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.379784 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2da6-account-create-update-8v4qc" event={"ID":"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0","Type":"ContainerStarted","Data":"4cfd46a2745cd264bdc2086cf9d9f296dc539b8996917e725002817f3e17f6ca"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.381859 4830 generic.go:334] "Generic (PLEG): container finished" podID="9216b1e8-8423-4bb0-ac8e-c9c9f32e827d" containerID="8ca726f13d882498313f36c1d035d7b93adecd23cc546d081e01a0b18beb2ecd" exitCode=0
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.381898 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4dsqs" event={"ID":"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d","Type":"ContainerDied","Data":"8ca726f13d882498313f36c1d035d7b93adecd23cc546d081e01a0b18beb2ecd"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.383172 4830 generic.go:334] "Generic (PLEG): container finished" podID="fe70376b-be05-4aba-a39a-850335299924" containerID="cf56e493f1cb721439ce09910e2edefc4bca70367610b1eaef3ea49bbbd618de" exitCode=0
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.383227 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f109-account-create-update-xvvcg" event={"ID":"fe70376b-be05-4aba-a39a-850335299924","Type":"ContainerDied","Data":"cf56e493f1cb721439ce09910e2edefc4bca70367610b1eaef3ea49bbbd618de"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.384661 4830 generic.go:334] "Generic (PLEG): container finished" podID="61a2e8d0-730f-4d64-ad5c-87e35dda7be9" containerID="cd57baeb947c14b301bd8ef404f1f924755b9953c55251d099a0a982e7dc7463" exitCode=0
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.384701 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gsqqs" event={"ID":"61a2e8d0-730f-4d64-ad5c-87e35dda7be9","Type":"ContainerDied","Data":"cd57baeb947c14b301bd8ef404f1f924755b9953c55251d099a0a982e7dc7463"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.386136 4830 generic.go:334] "Generic (PLEG): container finished" podID="8b6d682a-4504-4159-852e-36c0e757a98c" containerID="de47a77de53b1045546f55aa83ee6761d8f76ca7da73291f8e773734cb17a46b" exitCode=0
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.386207 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t4g4z" event={"ID":"8b6d682a-4504-4159-852e-36c0e757a98c","Type":"ContainerDied","Data":"de47a77de53b1045546f55aa83ee6761d8f76ca7da73291f8e773734cb17a46b"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.389184 4830 generic.go:334] "Generic (PLEG): container finished" podID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerID="e1c2ee352a1aa9ad3a87217fdc9b26896b4b6b672422c1eb960b0604da915f64" exitCode=0
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.389219 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a","Type":"ContainerDied","Data":"e1c2ee352a1aa9ad3a87217fdc9b26896b4b6b672422c1eb960b0604da915f64"}
Mar 11 09:36:12 crc kubenswrapper[4830]: I0311 09:36:12.881806 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.010291 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-scripts\") pod \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.010495 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-internal-tls-certs\") pod \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.010548 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-httpd-run\") pod \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.011456 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-logs\") pod \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.011510 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-combined-ca-bundle\") pod \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.011543 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqlcm\" (UniqueName: \"kubernetes.io/projected/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-kube-api-access-mqlcm\") pod \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.011546 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" (UID: "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.011587 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.011692 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-config-data\") pod \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\" (UID: \"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a\") "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.012606 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-logs" (OuterVolumeSpecName: "logs") pod "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" (UID: "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.013570 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.013601 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-logs\") on node \"crc\" DevicePath \"\""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.016645 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" (UID: "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.016668 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-scripts" (OuterVolumeSpecName: "scripts") pod "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" (UID: "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.016900 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-kube-api-access-mqlcm" (OuterVolumeSpecName: "kube-api-access-mqlcm") pod "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" (UID: "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a"). InnerVolumeSpecName "kube-api-access-mqlcm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.038532 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" (UID: "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.056537 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-config-data" (OuterVolumeSpecName: "config-data") pod "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" (UID: "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.060693 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.060839 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.061331 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" (UID: "65344ab0-3c56-4a1b-ac72-ef54fbf8da4a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.115728 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.115765 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.115780 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.115792 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqlcm\" (UniqueName: \"kubernetes.io/projected/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-kube-api-access-mqlcm\") on node \"crc\" DevicePath \"\""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.115817 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.115830 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.143899 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.218144 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.401881 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"65344ab0-3c56-4a1b-ac72-ef54fbf8da4a","Type":"ContainerDied","Data":"a274c53d5ceb4bd315c3002caff9cf7d224f34f5b58efd0471d0090f3e9935a8"}
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.401975 4830 scope.go:117] "RemoveContainer" containerID="e1c2ee352a1aa9ad3a87217fdc9b26896b4b6b672422c1eb960b0604da915f64"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.405318 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.449761 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.458712 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.473858 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 09:36:13 crc kubenswrapper[4830]: E0311 09:36:13.474387 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerName="glance-httpd"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.474410 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerName="glance-httpd"
Mar 11 09:36:13 crc kubenswrapper[4830]: E0311 09:36:13.474428 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerName="glance-log"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.474437 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerName="glance-log"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.474655 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerName="glance-httpd"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.474674 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" containerName="glance-log"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.475637 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.479063 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.479309 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.492534 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.531602 4830 scope.go:117] "RemoveContainer" containerID="2b540663e7c741945a9ec92868556e9360efff84fd073507c873423b1393f6a6"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.626926 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.627307 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.627422 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.627465 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.627512 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.627536 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldc76\" (UniqueName: \"kubernetes.io/projected/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-kube-api-access-ldc76\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.627582 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.627633 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.729167 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.729242 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.729284 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.729306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldc76\" (UniqueName: \"kubernetes.io/projected/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-kube-api-access-ldc76\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.729326 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.729386 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.729450 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.729479 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0"
Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.730891 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName:
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.731555 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.731639 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.735560 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.736293 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.736478 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.736511 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.750689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldc76\" (UniqueName: \"kubernetes.io/projected/a6069e67-6f76-4a02-9c90-d1ac74d8aaca-kube-api-access-ldc76\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.787304 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a6069e67-6f76-4a02-9c90-d1ac74d8aaca\") " pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.800299 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:13 crc kubenswrapper[4830]: I0311 09:36:13.945246 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f109-account-create-update-xvvcg" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.035125 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4t7q\" (UniqueName: \"kubernetes.io/projected/fe70376b-be05-4aba-a39a-850335299924-kube-api-access-c4t7q\") pod \"fe70376b-be05-4aba-a39a-850335299924\" (UID: \"fe70376b-be05-4aba-a39a-850335299924\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.035440 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe70376b-be05-4aba-a39a-850335299924-operator-scripts\") pod \"fe70376b-be05-4aba-a39a-850335299924\" (UID: \"fe70376b-be05-4aba-a39a-850335299924\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.037931 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe70376b-be05-4aba-a39a-850335299924-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe70376b-be05-4aba-a39a-850335299924" (UID: "fe70376b-be05-4aba-a39a-850335299924"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.044579 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe70376b-be05-4aba-a39a-850335299924-kube-api-access-c4t7q" (OuterVolumeSpecName: "kube-api-access-c4t7q") pod "fe70376b-be05-4aba-a39a-850335299924" (UID: "fe70376b-be05-4aba-a39a-850335299924"). InnerVolumeSpecName "kube-api-access-c4t7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.144834 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4t7q\" (UniqueName: \"kubernetes.io/projected/fe70376b-be05-4aba-a39a-850335299924-kube-api-access-c4t7q\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.144977 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe70376b-be05-4aba-a39a-850335299924-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.160697 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gsqqs" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.180547 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4dsqs" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.195560 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-74ec-account-create-update-qnfw8" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.210900 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t4g4z" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.220657 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2da6-account-create-update-8v4qc" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.347893 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-operator-scripts\") pod \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\" (UID: \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.347942 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6d682a-4504-4159-852e-36c0e757a98c-operator-scripts\") pod \"8b6d682a-4504-4159-852e-36c0e757a98c\" (UID: \"8b6d682a-4504-4159-852e-36c0e757a98c\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.347967 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbqpw\" (UniqueName: \"kubernetes.io/projected/8b6d682a-4504-4159-852e-36c0e757a98c-kube-api-access-bbqpw\") pod \"8b6d682a-4504-4159-852e-36c0e757a98c\" (UID: \"8b6d682a-4504-4159-852e-36c0e757a98c\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.348083 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-operator-scripts\") pod \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\" (UID: \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.348144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn6mb\" (UniqueName: \"kubernetes.io/projected/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-kube-api-access-rn6mb\") pod \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\" (UID: \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.348165 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-t2xh9\" (UniqueName: \"kubernetes.io/projected/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-kube-api-access-t2xh9\") pod \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\" (UID: \"61a2e8d0-730f-4d64-ad5c-87e35dda7be9\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.348225 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nnds\" (UniqueName: \"kubernetes.io/projected/641dc551-b5bc-455e-9deb-20542ef0ab9b-kube-api-access-8nnds\") pod \"641dc551-b5bc-455e-9deb-20542ef0ab9b\" (UID: \"641dc551-b5bc-455e-9deb-20542ef0ab9b\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.348299 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641dc551-b5bc-455e-9deb-20542ef0ab9b-operator-scripts\") pod \"641dc551-b5bc-455e-9deb-20542ef0ab9b\" (UID: \"641dc551-b5bc-455e-9deb-20542ef0ab9b\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.348360 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-operator-scripts\") pod \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\" (UID: \"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.348390 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-778m8\" (UniqueName: \"kubernetes.io/projected/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-kube-api-access-778m8\") pod \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\" (UID: \"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0\") " Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.350404 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0" (UID: "95b1c23f-429e-4d5c-85d0-a6cfc1816ae0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.350467 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9216b1e8-8423-4bb0-ac8e-c9c9f32e827d" (UID: "9216b1e8-8423-4bb0-ac8e-c9c9f32e827d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.350465 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6d682a-4504-4159-852e-36c0e757a98c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b6d682a-4504-4159-852e-36c0e757a98c" (UID: "8b6d682a-4504-4159-852e-36c0e757a98c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.350963 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61a2e8d0-730f-4d64-ad5c-87e35dda7be9" (UID: "61a2e8d0-730f-4d64-ad5c-87e35dda7be9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.351491 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/641dc551-b5bc-455e-9deb-20542ef0ab9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "641dc551-b5bc-455e-9deb-20542ef0ab9b" (UID: "641dc551-b5bc-455e-9deb-20542ef0ab9b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.353630 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-kube-api-access-rn6mb" (OuterVolumeSpecName: "kube-api-access-rn6mb") pod "9216b1e8-8423-4bb0-ac8e-c9c9f32e827d" (UID: "9216b1e8-8423-4bb0-ac8e-c9c9f32e827d"). InnerVolumeSpecName "kube-api-access-rn6mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.353804 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-kube-api-access-t2xh9" (OuterVolumeSpecName: "kube-api-access-t2xh9") pod "61a2e8d0-730f-4d64-ad5c-87e35dda7be9" (UID: "61a2e8d0-730f-4d64-ad5c-87e35dda7be9"). InnerVolumeSpecName "kube-api-access-t2xh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.354131 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-kube-api-access-778m8" (OuterVolumeSpecName: "kube-api-access-778m8") pod "95b1c23f-429e-4d5c-85d0-a6cfc1816ae0" (UID: "95b1c23f-429e-4d5c-85d0-a6cfc1816ae0"). InnerVolumeSpecName "kube-api-access-778m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.355133 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/641dc551-b5bc-455e-9deb-20542ef0ab9b-kube-api-access-8nnds" (OuterVolumeSpecName: "kube-api-access-8nnds") pod "641dc551-b5bc-455e-9deb-20542ef0ab9b" (UID: "641dc551-b5bc-455e-9deb-20542ef0ab9b"). InnerVolumeSpecName "kube-api-access-8nnds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.355192 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b6d682a-4504-4159-852e-36c0e757a98c-kube-api-access-bbqpw" (OuterVolumeSpecName: "kube-api-access-bbqpw") pod "8b6d682a-4504-4159-852e-36c0e757a98c" (UID: "8b6d682a-4504-4159-852e-36c0e757a98c"). InnerVolumeSpecName "kube-api-access-bbqpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.413774 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-74ec-account-create-update-qnfw8" event={"ID":"641dc551-b5bc-455e-9deb-20542ef0ab9b","Type":"ContainerDied","Data":"837b27c9aa288a8c9f8e3e07c029bb8a91d547509fe719e70c6d9e62f966f413"} Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.415555 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="837b27c9aa288a8c9f8e3e07c029bb8a91d547509fe719e70c6d9e62f966f413" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.413824 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-74ec-account-create-update-qnfw8" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.419310 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2da6-account-create-update-8v4qc" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.419316 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2da6-account-create-update-8v4qc" event={"ID":"95b1c23f-429e-4d5c-85d0-a6cfc1816ae0","Type":"ContainerDied","Data":"4cfd46a2745cd264bdc2086cf9d9f296dc539b8996917e725002817f3e17f6ca"} Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.419363 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cfd46a2745cd264bdc2086cf9d9f296dc539b8996917e725002817f3e17f6ca" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.422873 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4dsqs" event={"ID":"9216b1e8-8423-4bb0-ac8e-c9c9f32e827d","Type":"ContainerDied","Data":"1566ed56ef10e28454b6a11ca5990efd96004132ea325981b578cb9280862685"} Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.422885 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4dsqs" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.422914 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1566ed56ef10e28454b6a11ca5990efd96004132ea325981b578cb9280862685" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.424805 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f109-account-create-update-xvvcg" event={"ID":"fe70376b-be05-4aba-a39a-850335299924","Type":"ContainerDied","Data":"a6c7646e716fc1151d7bf4a675feeef089e8814dcbaa0fa0461c673d1cb3ba69"} Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.424840 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6c7646e716fc1151d7bf4a675feeef089e8814dcbaa0fa0461c673d1cb3ba69" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.424892 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f109-account-create-update-xvvcg" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.431637 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gsqqs" event={"ID":"61a2e8d0-730f-4d64-ad5c-87e35dda7be9","Type":"ContainerDied","Data":"3a6c68126f9b7c240e4d0002a8c799b6a26816797f4c8aa2b74ec101b64410d5"} Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.431659 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gsqqs" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.431669 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a6c68126f9b7c240e4d0002a8c799b6a26816797f4c8aa2b74ec101b64410d5" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.435431 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-t4g4z" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.435429 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t4g4z" event={"ID":"8b6d682a-4504-4159-852e-36c0e757a98c","Type":"ContainerDied","Data":"3208f82465fbac146679003c82468db5e6d21ea3115c4a6fb0124fea755e16bc"} Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.435487 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3208f82465fbac146679003c82468db5e6d21ea3115c4a6fb0124fea755e16bc" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450385 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nnds\" (UniqueName: \"kubernetes.io/projected/641dc551-b5bc-455e-9deb-20542ef0ab9b-kube-api-access-8nnds\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450705 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/641dc551-b5bc-455e-9deb-20542ef0ab9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450720 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450733 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-778m8\" (UniqueName: \"kubernetes.io/projected/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-kube-api-access-778m8\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450744 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc 
kubenswrapper[4830]: I0311 09:36:14.450756 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b6d682a-4504-4159-852e-36c0e757a98c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450767 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbqpw\" (UniqueName: \"kubernetes.io/projected/8b6d682a-4504-4159-852e-36c0e757a98c-kube-api-access-bbqpw\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450779 4830 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450792 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn6mb\" (UniqueName: \"kubernetes.io/projected/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d-kube-api-access-rn6mb\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.450803 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2xh9\" (UniqueName: \"kubernetes.io/projected/61a2e8d0-730f-4d64-ad5c-87e35dda7be9-kube-api-access-t2xh9\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.565791 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 11 09:36:14 crc kubenswrapper[4830]: I0311 09:36:14.949082 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65344ab0-3c56-4a1b-ac72-ef54fbf8da4a" path="/var/lib/kubelet/pods/65344ab0-3c56-4a1b-ac72-ef54fbf8da4a/volumes" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.447282 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"a6069e67-6f76-4a02-9c90-d1ac74d8aaca","Type":"ContainerStarted","Data":"f176aa91e94d6e2c232859c841fb688410dab55f2dbefd3b51af4f97b7a1f527"} Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.447579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6069e67-6f76-4a02-9c90-d1ac74d8aaca","Type":"ContainerStarted","Data":"3569aa6d18770301901c9cd804674676179952fd7275e8a32e60f31e262b6c24"} Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.679224 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmx2j"] Mar 11 09:36:15 crc kubenswrapper[4830]: E0311 09:36:15.680932 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6d682a-4504-4159-852e-36c0e757a98c" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.680963 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6d682a-4504-4159-852e-36c0e757a98c" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: E0311 09:36:15.680986 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9216b1e8-8423-4bb0-ac8e-c9c9f32e827d" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.680994 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9216b1e8-8423-4bb0-ac8e-c9c9f32e827d" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: E0311 09:36:15.681012 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe70376b-be05-4aba-a39a-850335299924" containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681039 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe70376b-be05-4aba-a39a-850335299924" containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: E0311 09:36:15.681052 4830 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="641dc551-b5bc-455e-9deb-20542ef0ab9b" containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681060 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="641dc551-b5bc-455e-9deb-20542ef0ab9b" containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: E0311 09:36:15.681083 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95b1c23f-429e-4d5c-85d0-a6cfc1816ae0" containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681091 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b1c23f-429e-4d5c-85d0-a6cfc1816ae0" containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: E0311 09:36:15.681109 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a2e8d0-730f-4d64-ad5c-87e35dda7be9" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681117 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a2e8d0-730f-4d64-ad5c-87e35dda7be9" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681353 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6d682a-4504-4159-852e-36c0e757a98c" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681375 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="95b1c23f-429e-4d5c-85d0-a6cfc1816ae0" containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681388 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a2e8d0-730f-4d64-ad5c-87e35dda7be9" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681405 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="641dc551-b5bc-455e-9deb-20542ef0ab9b" 
containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681423 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9216b1e8-8423-4bb0-ac8e-c9c9f32e827d" containerName="mariadb-database-create" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.681437 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe70376b-be05-4aba-a39a-850335299924" containerName="mariadb-account-create-update" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.682189 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.689164 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.689335 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.689483 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vd559" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.698208 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmx2j"] Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.779344 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cgr\" (UniqueName: \"kubernetes.io/projected/8f0cbfba-9a9b-43cd-8d56-b500764edebd-kube-api-access-68cgr\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.779422 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.779596 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-config-data\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.779646 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-scripts\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.880682 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.880797 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-config-data\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.880838 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-scripts\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.880920 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68cgr\" (UniqueName: \"kubernetes.io/projected/8f0cbfba-9a9b-43cd-8d56-b500764edebd-kube-api-access-68cgr\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.886002 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-scripts\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.886305 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.886344 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-config-data\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:15 crc kubenswrapper[4830]: I0311 09:36:15.899390 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cgr\" (UniqueName: 
\"kubernetes.io/projected/8f0cbfba-9a9b-43cd-8d56-b500764edebd-kube-api-access-68cgr\") pod \"nova-cell0-conductor-db-sync-fmx2j\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.089492 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.179991 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.202518 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.204460 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerName="glance-log" containerID="cri-o://c17c3e29f31e18bb52beea83b03bc239545f7ca00d58d5b0c72578eb927d435a" gracePeriod=30 Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.204828 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerName="glance-httpd" containerID="cri-o://c2440926a74fefbe03c6e61dd97818c5b356bcd468059234f6638c9e396999e2" gracePeriod=30 Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.293194 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-sg-core-conf-yaml\") pod \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.293508 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-config-data\") pod \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.293610 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-run-httpd\") pod \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.293646 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-log-httpd\") pod \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.293742 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-scripts\") pod \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.293767 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8hh8\" (UniqueName: \"kubernetes.io/projected/7b227cb3-df95-4a38-b843-ff5cfe922fe1-kube-api-access-w8hh8\") pod \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.293792 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-combined-ca-bundle\") pod \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\" (UID: \"7b227cb3-df95-4a38-b843-ff5cfe922fe1\") " Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.294828 
4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b227cb3-df95-4a38-b843-ff5cfe922fe1" (UID: "7b227cb3-df95-4a38-b843-ff5cfe922fe1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.295189 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b227cb3-df95-4a38-b843-ff5cfe922fe1" (UID: "7b227cb3-df95-4a38-b843-ff5cfe922fe1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.300376 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b227cb3-df95-4a38-b843-ff5cfe922fe1-kube-api-access-w8hh8" (OuterVolumeSpecName: "kube-api-access-w8hh8") pod "7b227cb3-df95-4a38-b843-ff5cfe922fe1" (UID: "7b227cb3-df95-4a38-b843-ff5cfe922fe1"). InnerVolumeSpecName "kube-api-access-w8hh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.300419 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-scripts" (OuterVolumeSpecName: "scripts") pod "7b227cb3-df95-4a38-b843-ff5cfe922fe1" (UID: "7b227cb3-df95-4a38-b843-ff5cfe922fe1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.326989 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b227cb3-df95-4a38-b843-ff5cfe922fe1" (UID: "7b227cb3-df95-4a38-b843-ff5cfe922fe1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.371174 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b227cb3-df95-4a38-b843-ff5cfe922fe1" (UID: "7b227cb3-df95-4a38-b843-ff5cfe922fe1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.396453 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.396494 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b227cb3-df95-4a38-b843-ff5cfe922fe1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.396510 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.396523 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8hh8\" (UniqueName: \"kubernetes.io/projected/7b227cb3-df95-4a38-b843-ff5cfe922fe1-kube-api-access-w8hh8\") on node \"crc\" DevicePath \"\"" Mar 
11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.396537 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.396546 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.413239 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-config-data" (OuterVolumeSpecName: "config-data") pod "7b227cb3-df95-4a38-b843-ff5cfe922fe1" (UID: "7b227cb3-df95-4a38-b843-ff5cfe922fe1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.458549 4830 generic.go:334] "Generic (PLEG): container finished" podID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerID="c17c3e29f31e18bb52beea83b03bc239545f7ca00d58d5b0c72578eb927d435a" exitCode=143 Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.458613 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e4a85e2-f1a7-4463-8795-55508a60df90","Type":"ContainerDied","Data":"c17c3e29f31e18bb52beea83b03bc239545f7ca00d58d5b0c72578eb927d435a"} Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.462042 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6069e67-6f76-4a02-9c90-d1ac74d8aaca","Type":"ContainerStarted","Data":"2b8331b996b744834424d8ca71195f213d061e9c2b86f37ce35fb1621f65039d"} Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.468073 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerID="91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7" exitCode=0 Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.468119 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerDied","Data":"91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7"} Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.468151 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b227cb3-df95-4a38-b843-ff5cfe922fe1","Type":"ContainerDied","Data":"ab35d7d7cc735455d1080123cc5a3d101fff16e4fcf6ea307c98be4eab3f29c1"} Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.468172 4830 scope.go:117] "RemoveContainer" containerID="08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.468350 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.497587 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b227cb3-df95-4a38-b843-ff5cfe922fe1-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.507986 4830 scope.go:117] "RemoveContainer" containerID="ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.527958 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.527919743 podStartE2EDuration="3.527919743s" podCreationTimestamp="2026-03-11 09:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:36:16.496340406 +0000 UTC m=+1344.277491095" watchObservedRunningTime="2026-03-11 09:36:16.527919743 +0000 UTC m=+1344.309070432" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.541964 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.560603 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.561441 4830 scope.go:117] "RemoveContainer" containerID="175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.607383 4830 scope.go:117] "RemoveContainer" containerID="91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.632581 4830 scope.go:117] "RemoveContainer" containerID="08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.644686 4830 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:16 crc kubenswrapper[4830]: E0311 09:36:16.645457 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="ceilometer-notification-agent" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.645473 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="ceilometer-notification-agent" Mar 11 09:36:16 crc kubenswrapper[4830]: E0311 09:36:16.645490 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="sg-core" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.645496 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="sg-core" Mar 11 09:36:16 crc kubenswrapper[4830]: E0311 09:36:16.645507 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="ceilometer-central-agent" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.645514 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="ceilometer-central-agent" Mar 11 09:36:16 crc kubenswrapper[4830]: E0311 09:36:16.645543 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="proxy-httpd" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.645549 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="proxy-httpd" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.645726 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="proxy-httpd" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.645738 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="sg-core" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.645757 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="ceilometer-notification-agent" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.645773 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" containerName="ceilometer-central-agent" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.648306 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: E0311 09:36:16.652200 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312\": container with ID starting with 08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312 not found: ID does not exist" containerID="08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.652271 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312"} err="failed to get container status \"08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312\": rpc error: code = NotFound desc = could not find container \"08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312\": container with ID starting with 08c93b68565c60bdaa20960b082c82bce8f87f4da68b89bf05158216ca5a4312 not found: ID does not exist" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.652319 4830 scope.go:117] "RemoveContainer" containerID="ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.652461 4830 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.652655 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 09:36:16 crc kubenswrapper[4830]: E0311 09:36:16.667096 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d\": container with ID starting with ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d not found: ID does not exist" containerID="ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.667186 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d"} err="failed to get container status \"ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d\": rpc error: code = NotFound desc = could not find container \"ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d\": container with ID starting with ebe79f473e778585d9fdd29334128cb397d006b26ae3b82db9516e504b40210d not found: ID does not exist" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.667224 4830 scope.go:117] "RemoveContainer" containerID="175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325" Mar 11 09:36:16 crc kubenswrapper[4830]: E0311 09:36:16.667601 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325\": container with ID starting with 175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325 not found: ID does not exist" containerID="175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 
09:36:16.667643 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325"} err="failed to get container status \"175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325\": rpc error: code = NotFound desc = could not find container \"175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325\": container with ID starting with 175c1c32e69ab8b07d55818c73bd2a482be9daaed9bfd2c97cfd77c609015325 not found: ID does not exist" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.667667 4830 scope.go:117] "RemoveContainer" containerID="91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7" Mar 11 09:36:16 crc kubenswrapper[4830]: E0311 09:36:16.667940 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7\": container with ID starting with 91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7 not found: ID does not exist" containerID="91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.667968 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7"} err="failed to get container status \"91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7\": rpc error: code = NotFound desc = could not find container \"91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7\": container with ID starting with 91a68c2822f2ea88a3e0edc32301dde6c3a91e27fee46d18c4faf83ed911bfd7 not found: ID does not exist" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.672627 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.697718 4830 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmx2j"] Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.805746 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-run-httpd\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.805851 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.805886 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.805941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-log-httpd\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.805980 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-scripts\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 
09:36:16.806095 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbvb9\" (UniqueName: \"kubernetes.io/projected/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-kube-api-access-fbvb9\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.806128 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-config-data\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.908965 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.909069 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.909142 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-log-httpd\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.909194 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-scripts\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.909240 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbvb9\" (UniqueName: \"kubernetes.io/projected/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-kube-api-access-fbvb9\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.909283 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-config-data\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.909344 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-run-httpd\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.909989 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-run-httpd\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.910203 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-log-httpd\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.916874 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.917337 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-scripts\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.917931 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.931824 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbvb9\" (UniqueName: \"kubernetes.io/projected/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-kube-api-access-fbvb9\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.932495 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-config-data\") pod \"ceilometer-0\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " pod="openstack/ceilometer-0" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.953714 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b227cb3-df95-4a38-b843-ff5cfe922fe1" path="/var/lib/kubelet/pods/7b227cb3-df95-4a38-b843-ff5cfe922fe1/volumes" Mar 11 09:36:16 crc kubenswrapper[4830]: I0311 09:36:16.985533 4830 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.188753 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.326709 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-tls-certs\") pod \"242c5a27-bc92-42f0-b630-6d1f3cd55822\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.326851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242c5a27-bc92-42f0-b630-6d1f3cd55822-logs\") pod \"242c5a27-bc92-42f0-b630-6d1f3cd55822\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.326899 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-scripts\") pod \"242c5a27-bc92-42f0-b630-6d1f3cd55822\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.326960 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlwps\" (UniqueName: \"kubernetes.io/projected/242c5a27-bc92-42f0-b630-6d1f3cd55822-kube-api-access-rlwps\") pod \"242c5a27-bc92-42f0-b630-6d1f3cd55822\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.326991 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-combined-ca-bundle\") pod \"242c5a27-bc92-42f0-b630-6d1f3cd55822\" (UID: 
\"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.327045 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-config-data\") pod \"242c5a27-bc92-42f0-b630-6d1f3cd55822\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.327124 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-secret-key\") pod \"242c5a27-bc92-42f0-b630-6d1f3cd55822\" (UID: \"242c5a27-bc92-42f0-b630-6d1f3cd55822\") " Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.328077 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/242c5a27-bc92-42f0-b630-6d1f3cd55822-logs" (OuterVolumeSpecName: "logs") pod "242c5a27-bc92-42f0-b630-6d1f3cd55822" (UID: "242c5a27-bc92-42f0-b630-6d1f3cd55822"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.334811 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "242c5a27-bc92-42f0-b630-6d1f3cd55822" (UID: "242c5a27-bc92-42f0-b630-6d1f3cd55822"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.337829 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242c5a27-bc92-42f0-b630-6d1f3cd55822-kube-api-access-rlwps" (OuterVolumeSpecName: "kube-api-access-rlwps") pod "242c5a27-bc92-42f0-b630-6d1f3cd55822" (UID: "242c5a27-bc92-42f0-b630-6d1f3cd55822"). 
InnerVolumeSpecName "kube-api-access-rlwps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.362269 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-config-data" (OuterVolumeSpecName: "config-data") pod "242c5a27-bc92-42f0-b630-6d1f3cd55822" (UID: "242c5a27-bc92-42f0-b630-6d1f3cd55822"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.375453 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "242c5a27-bc92-42f0-b630-6d1f3cd55822" (UID: "242c5a27-bc92-42f0-b630-6d1f3cd55822"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.380681 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-scripts" (OuterVolumeSpecName: "scripts") pod "242c5a27-bc92-42f0-b630-6d1f3cd55822" (UID: "242c5a27-bc92-42f0-b630-6d1f3cd55822"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.394136 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "242c5a27-bc92-42f0-b630-6d1f3cd55822" (UID: "242c5a27-bc92-42f0-b630-6d1f3cd55822"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.430125 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.430493 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlwps\" (UniqueName: \"kubernetes.io/projected/242c5a27-bc92-42f0-b630-6d1f3cd55822-kube-api-access-rlwps\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.430507 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.430516 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/242c5a27-bc92-42f0-b630-6d1f3cd55822-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.430526 4830 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.430536 4830 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/242c5a27-bc92-42f0-b630-6d1f3cd55822-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.430565 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/242c5a27-bc92-42f0-b630-6d1f3cd55822-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.484190 4830 generic.go:334] 
"Generic (PLEG): container finished" podID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerID="af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10" exitCode=137 Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.484237 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6b87df74-q5t2v" event={"ID":"242c5a27-bc92-42f0-b630-6d1f3cd55822","Type":"ContainerDied","Data":"af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10"} Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.484297 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f6b87df74-q5t2v" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.484318 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6b87df74-q5t2v" event={"ID":"242c5a27-bc92-42f0-b630-6d1f3cd55822","Type":"ContainerDied","Data":"fe95df3fb45002513d42a2ac462809f4eaa5ba7dcd9f05458b08da62b1a8d992"} Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.484343 4830 scope.go:117] "RemoveContainer" containerID="e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.485944 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" event={"ID":"8f0cbfba-9a9b-43cd-8d56-b500764edebd","Type":"ContainerStarted","Data":"d846ebcf6e143921cf84dbc1fdbae942e2ec2ce82a3ba124924fa82a28d5b554"} Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.536132 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f6b87df74-q5t2v"] Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.551199 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f6b87df74-q5t2v"] Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.560006 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 
09:36:17.700927 4830 scope.go:117] "RemoveContainer" containerID="af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10" Mar 11 09:36:17 crc kubenswrapper[4830]: W0311 09:36:17.710391 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6485d2e_1d7a_4b8e_a4ba_9985cec52aae.slice/crio-58c125f1492262ef984be51413d645e5cf2c062b6f16063a9467a9cc5a71dba0 WatchSource:0}: Error finding container 58c125f1492262ef984be51413d645e5cf2c062b6f16063a9467a9cc5a71dba0: Status 404 returned error can't find the container with id 58c125f1492262ef984be51413d645e5cf2c062b6f16063a9467a9cc5a71dba0 Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.747875 4830 scope.go:117] "RemoveContainer" containerID="e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8" Mar 11 09:36:17 crc kubenswrapper[4830]: E0311 09:36:17.748706 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8\": container with ID starting with e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8 not found: ID does not exist" containerID="e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.748772 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8"} err="failed to get container status \"e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8\": rpc error: code = NotFound desc = could not find container \"e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8\": container with ID starting with e75513e7fa575d1ba3ab46eb00f2283a52dbd62e176d6fdf8dab252b1c5b2bd8 not found: ID does not exist" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.748814 4830 scope.go:117] 
"RemoveContainer" containerID="af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10" Mar 11 09:36:17 crc kubenswrapper[4830]: E0311 09:36:17.749345 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10\": container with ID starting with af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10 not found: ID does not exist" containerID="af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10" Mar 11 09:36:17 crc kubenswrapper[4830]: I0311 09:36:17.749423 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10"} err="failed to get container status \"af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10\": rpc error: code = NotFound desc = could not find container \"af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10\": container with ID starting with af0af9758829a86257c22c0fe0ff102d0952b4520d8a902892000c1a8c9e1e10 not found: ID does not exist" Mar 11 09:36:18 crc kubenswrapper[4830]: I0311 09:36:18.507736 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerStarted","Data":"58c125f1492262ef984be51413d645e5cf2c062b6f16063a9467a9cc5a71dba0"} Mar 11 09:36:18 crc kubenswrapper[4830]: I0311 09:36:18.943954 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" path="/var/lib/kubelet/pods/242c5a27-bc92-42f0-b630-6d1f3cd55822/volumes" Mar 11 09:36:19 crc kubenswrapper[4830]: I0311 09:36:19.534635 4830 generic.go:334] "Generic (PLEG): container finished" podID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerID="c2440926a74fefbe03c6e61dd97818c5b356bcd468059234f6638c9e396999e2" exitCode=0 Mar 11 09:36:19 crc 
kubenswrapper[4830]: I0311 09:36:19.534674 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e4a85e2-f1a7-4463-8795-55508a60df90","Type":"ContainerDied","Data":"c2440926a74fefbe03c6e61dd97818c5b356bcd468059234f6638c9e396999e2"} Mar 11 09:36:19 crc kubenswrapper[4830]: I0311 09:36:19.538063 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerStarted","Data":"38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94"} Mar 11 09:36:19 crc kubenswrapper[4830]: I0311 09:36:19.538259 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerStarted","Data":"4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8"} Mar 11 09:36:19 crc kubenswrapper[4830]: E0311 09:36:19.555074 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e4a85e2_f1a7_4463_8795_55508a60df90.slice/crio-conmon-c2440926a74fefbe03c6e61dd97818c5b356bcd468059234f6638c9e396999e2.scope\": RecentStats: unable to find data in memory cache]" Mar 11 09:36:19 crc kubenswrapper[4830]: I0311 09:36:19.952036 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.077618 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-logs\") pod \"0e4a85e2-f1a7-4463-8795-55508a60df90\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.077695 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-scripts\") pod \"0e4a85e2-f1a7-4463-8795-55508a60df90\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.077745 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-public-tls-certs\") pod \"0e4a85e2-f1a7-4463-8795-55508a60df90\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.077804 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmt97\" (UniqueName: \"kubernetes.io/projected/0e4a85e2-f1a7-4463-8795-55508a60df90-kube-api-access-qmt97\") pod \"0e4a85e2-f1a7-4463-8795-55508a60df90\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.077854 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-combined-ca-bundle\") pod \"0e4a85e2-f1a7-4463-8795-55508a60df90\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.077905 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-httpd-run\") pod \"0e4a85e2-f1a7-4463-8795-55508a60df90\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.077925 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-config-data\") pod \"0e4a85e2-f1a7-4463-8795-55508a60df90\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.078009 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0e4a85e2-f1a7-4463-8795-55508a60df90\" (UID: \"0e4a85e2-f1a7-4463-8795-55508a60df90\") " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.078325 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-logs" (OuterVolumeSpecName: "logs") pod "0e4a85e2-f1a7-4463-8795-55508a60df90" (UID: "0e4a85e2-f1a7-4463-8795-55508a60df90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.078879 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.079480 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0e4a85e2-f1a7-4463-8795-55508a60df90" (UID: "0e4a85e2-f1a7-4463-8795-55508a60df90"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.086167 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-scripts" (OuterVolumeSpecName: "scripts") pod "0e4a85e2-f1a7-4463-8795-55508a60df90" (UID: "0e4a85e2-f1a7-4463-8795-55508a60df90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.095760 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "0e4a85e2-f1a7-4463-8795-55508a60df90" (UID: "0e4a85e2-f1a7-4463-8795-55508a60df90"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.097031 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e4a85e2-f1a7-4463-8795-55508a60df90-kube-api-access-qmt97" (OuterVolumeSpecName: "kube-api-access-qmt97") pod "0e4a85e2-f1a7-4463-8795-55508a60df90" (UID: "0e4a85e2-f1a7-4463-8795-55508a60df90"). InnerVolumeSpecName "kube-api-access-qmt97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.116526 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e4a85e2-f1a7-4463-8795-55508a60df90" (UID: "0e4a85e2-f1a7-4463-8795-55508a60df90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.153279 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0e4a85e2-f1a7-4463-8795-55508a60df90" (UID: "0e4a85e2-f1a7-4463-8795-55508a60df90"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.160436 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-config-data" (OuterVolumeSpecName: "config-data") pod "0e4a85e2-f1a7-4463-8795-55508a60df90" (UID: "0e4a85e2-f1a7-4463-8795-55508a60df90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.181671 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.181739 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.181756 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmt97\" (UniqueName: \"kubernetes.io/projected/0e4a85e2-f1a7-4463-8795-55508a60df90-kube-api-access-qmt97\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.181767 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.181776 4830 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e4a85e2-f1a7-4463-8795-55508a60df90-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.181785 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e4a85e2-f1a7-4463-8795-55508a60df90-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.181830 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.209937 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.284221 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.548244 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e4a85e2-f1a7-4463-8795-55508a60df90","Type":"ContainerDied","Data":"da1881f79daaf08d17b8939b6424cf94d32e1f688d2cdfbd6425b78891405c5c"} Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.548293 4830 scope.go:117] "RemoveContainer" containerID="c2440926a74fefbe03c6e61dd97818c5b356bcd468059234f6638c9e396999e2" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.548405 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.581803 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.589672 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613122 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:36:20 crc kubenswrapper[4830]: E0311 09:36:20.613485 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613500 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" Mar 11 09:36:20 crc kubenswrapper[4830]: E0311 09:36:20.613521 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerName="glance-httpd" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613529 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerName="glance-httpd" Mar 11 09:36:20 crc kubenswrapper[4830]: E0311 09:36:20.613549 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerName="glance-log" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613554 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerName="glance-log" Mar 11 09:36:20 crc kubenswrapper[4830]: E0311 09:36:20.613568 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon-log" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613573 4830 
state_mem.go:107] "Deleted CPUSet assignment" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon-log" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613743 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerName="glance-log" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613755 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon-log" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613768 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" containerName="glance-httpd" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.613778 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="242c5a27-bc92-42f0-b630-6d1f3cd55822" containerName="horizon" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.614687 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.618219 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.622739 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.633484 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.691553 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.691617 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.691654 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.691685 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.691776 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c8947c5-6c54-4acb-9100-3c5ea0988770-logs\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.691829 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.691913 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7l8t\" (UniqueName: \"kubernetes.io/projected/4c8947c5-6c54-4acb-9100-3c5ea0988770-kube-api-access-n7l8t\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.691969 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c8947c5-6c54-4acb-9100-3c5ea0988770-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796420 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796516 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7l8t\" (UniqueName: \"kubernetes.io/projected/4c8947c5-6c54-4acb-9100-3c5ea0988770-kube-api-access-n7l8t\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796567 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c8947c5-6c54-4acb-9100-3c5ea0988770-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796605 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796634 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796658 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-config-data\") 
pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796686 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796733 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c8947c5-6c54-4acb-9100-3c5ea0988770-logs\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.796815 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.797213 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c8947c5-6c54-4acb-9100-3c5ea0988770-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.797245 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c8947c5-6c54-4acb-9100-3c5ea0988770-logs\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " 
pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.801695 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.801741 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.802141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.803708 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c8947c5-6c54-4acb-9100-3c5ea0988770-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.819551 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7l8t\" (UniqueName: \"kubernetes.io/projected/4c8947c5-6c54-4acb-9100-3c5ea0988770-kube-api-access-n7l8t\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: 
I0311 09:36:20.826951 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"4c8947c5-6c54-4acb-9100-3c5ea0988770\") " pod="openstack/glance-default-external-api-0" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.945468 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e4a85e2-f1a7-4463-8795-55508a60df90" path="/var/lib/kubelet/pods/0e4a85e2-f1a7-4463-8795-55508a60df90/volumes" Mar 11 09:36:20 crc kubenswrapper[4830]: I0311 09:36:20.963690 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 11 09:36:23 crc kubenswrapper[4830]: I0311 09:36:23.801138 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:23 crc kubenswrapper[4830]: I0311 09:36:23.801717 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:23 crc kubenswrapper[4830]: I0311 09:36:23.844738 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:23 crc kubenswrapper[4830]: I0311 09:36:23.856122 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:24 crc kubenswrapper[4830]: I0311 09:36:24.594335 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:24 crc kubenswrapper[4830]: I0311 09:36:24.594683 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:24 crc kubenswrapper[4830]: I0311 09:36:24.904186 4830 scope.go:117] "RemoveContainer" 
containerID="c17c3e29f31e18bb52beea83b03bc239545f7ca00d58d5b0c72578eb927d435a" Mar 11 09:36:25 crc kubenswrapper[4830]: I0311 09:36:25.497079 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 11 09:36:25 crc kubenswrapper[4830]: W0311 09:36:25.507643 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c8947c5_6c54_4acb_9100_3c5ea0988770.slice/crio-fc6974b44572bb22e740455f8033e92a8141b85a3190f9aed74e89fee90be74d WatchSource:0}: Error finding container fc6974b44572bb22e740455f8033e92a8141b85a3190f9aed74e89fee90be74d: Status 404 returned error can't find the container with id fc6974b44572bb22e740455f8033e92a8141b85a3190f9aed74e89fee90be74d Mar 11 09:36:25 crc kubenswrapper[4830]: I0311 09:36:25.604324 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c8947c5-6c54-4acb-9100-3c5ea0988770","Type":"ContainerStarted","Data":"fc6974b44572bb22e740455f8033e92a8141b85a3190f9aed74e89fee90be74d"} Mar 11 09:36:25 crc kubenswrapper[4830]: I0311 09:36:25.606536 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" event={"ID":"8f0cbfba-9a9b-43cd-8d56-b500764edebd","Type":"ContainerStarted","Data":"db59682059ad5d3852437c009d1ec91a3b7d8c392986953cfd586a2f8968ab27"} Mar 11 09:36:25 crc kubenswrapper[4830]: I0311 09:36:25.618580 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerStarted","Data":"7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e"} Mar 11 09:36:25 crc kubenswrapper[4830]: I0311 09:36:25.626365 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" podStartSLOduration=2.25086393 podStartE2EDuration="10.626338669s" 
podCreationTimestamp="2026-03-11 09:36:15 +0000 UTC" firstStartedPulling="2026-03-11 09:36:16.617262221 +0000 UTC m=+1344.398412910" lastFinishedPulling="2026-03-11 09:36:24.99273696 +0000 UTC m=+1352.773887649" observedRunningTime="2026-03-11 09:36:25.624636992 +0000 UTC m=+1353.405787691" watchObservedRunningTime="2026-03-11 09:36:25.626338669 +0000 UTC m=+1353.407489358" Mar 11 09:36:26 crc kubenswrapper[4830]: I0311 09:36:26.642041 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c8947c5-6c54-4acb-9100-3c5ea0988770","Type":"ContainerStarted","Data":"8e787a2a6352f26dbacb7579d8a1b034a77dd651a1554037e505dec07861529f"} Mar 11 09:36:26 crc kubenswrapper[4830]: I0311 09:36:26.832751 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:26 crc kubenswrapper[4830]: I0311 09:36:26.832859 4830 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 11 09:36:26 crc kubenswrapper[4830]: I0311 09:36:26.842726 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 11 09:36:27 crc kubenswrapper[4830]: I0311 09:36:27.655788 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c8947c5-6c54-4acb-9100-3c5ea0988770","Type":"ContainerStarted","Data":"59ae7acb93dba6b84f92c7ea5fba70542c727d49335acfcc37e0ec2754ec2cac"} Mar 11 09:36:27 crc kubenswrapper[4830]: I0311 09:36:27.689876 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.68985977 podStartE2EDuration="7.68985977s" podCreationTimestamp="2026-03-11 09:36:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:36:27.687409042 +0000 UTC 
m=+1355.468559771" watchObservedRunningTime="2026-03-11 09:36:27.68985977 +0000 UTC m=+1355.471010459" Mar 11 09:36:29 crc kubenswrapper[4830]: I0311 09:36:29.683974 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerStarted","Data":"e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50"} Mar 11 09:36:29 crc kubenswrapper[4830]: I0311 09:36:29.684452 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 09:36:29 crc kubenswrapper[4830]: I0311 09:36:29.707655 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.851004013 podStartE2EDuration="13.707635881s" podCreationTimestamp="2026-03-11 09:36:16 +0000 UTC" firstStartedPulling="2026-03-11 09:36:17.713870695 +0000 UTC m=+1345.495021384" lastFinishedPulling="2026-03-11 09:36:28.570502563 +0000 UTC m=+1356.351653252" observedRunningTime="2026-03-11 09:36:29.700930805 +0000 UTC m=+1357.482081504" watchObservedRunningTime="2026-03-11 09:36:29.707635881 +0000 UTC m=+1357.488786560" Mar 11 09:36:30 crc kubenswrapper[4830]: I0311 09:36:30.964388 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 09:36:30 crc kubenswrapper[4830]: I0311 09:36:30.965138 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 11 09:36:30 crc kubenswrapper[4830]: I0311 09:36:30.997782 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 09:36:31 crc kubenswrapper[4830]: I0311 09:36:31.025226 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 11 09:36:31 crc kubenswrapper[4830]: I0311 09:36:31.330452 4830 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:31 crc kubenswrapper[4830]: I0311 09:36:31.701634 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="ceilometer-central-agent" containerID="cri-o://4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8" gracePeriod=30 Mar 11 09:36:31 crc kubenswrapper[4830]: I0311 09:36:31.701709 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="sg-core" containerID="cri-o://7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e" gracePeriod=30 Mar 11 09:36:31 crc kubenswrapper[4830]: I0311 09:36:31.702060 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 09:36:31 crc kubenswrapper[4830]: I0311 09:36:31.702099 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 11 09:36:31 crc kubenswrapper[4830]: I0311 09:36:31.702005 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="ceilometer-notification-agent" containerID="cri-o://38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94" gracePeriod=30 Mar 11 09:36:31 crc kubenswrapper[4830]: I0311 09:36:31.702653 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="proxy-httpd" containerID="cri-o://e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50" gracePeriod=30 Mar 11 09:36:32 crc kubenswrapper[4830]: I0311 09:36:32.712626 4830 generic.go:334] "Generic (PLEG): container finished" podID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" 
containerID="e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50" exitCode=0 Mar 11 09:36:32 crc kubenswrapper[4830]: I0311 09:36:32.712914 4830 generic.go:334] "Generic (PLEG): container finished" podID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerID="7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e" exitCode=2 Mar 11 09:36:32 crc kubenswrapper[4830]: I0311 09:36:32.712928 4830 generic.go:334] "Generic (PLEG): container finished" podID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerID="4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8" exitCode=0 Mar 11 09:36:32 crc kubenswrapper[4830]: I0311 09:36:32.712710 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerDied","Data":"e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50"} Mar 11 09:36:32 crc kubenswrapper[4830]: I0311 09:36:32.713045 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerDied","Data":"7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e"} Mar 11 09:36:32 crc kubenswrapper[4830]: I0311 09:36:32.713062 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerDied","Data":"4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8"} Mar 11 09:36:33 crc kubenswrapper[4830]: I0311 09:36:33.621012 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 09:36:33 crc kubenswrapper[4830]: I0311 09:36:33.624919 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.421389 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.508446 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-scripts\") pod \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.508562 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-sg-core-conf-yaml\") pod \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.508639 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-run-httpd\") pod \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.508690 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-config-data\") pod \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.508722 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbvb9\" (UniqueName: \"kubernetes.io/projected/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-kube-api-access-fbvb9\") pod \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.508738 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-combined-ca-bundle\") pod \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.508780 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-log-httpd\") pod \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\" (UID: \"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae\") " Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.509397 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" (UID: "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.510290 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" (UID: "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.516582 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-kube-api-access-fbvb9" (OuterVolumeSpecName: "kube-api-access-fbvb9") pod "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" (UID: "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae"). InnerVolumeSpecName "kube-api-access-fbvb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.518221 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-scripts" (OuterVolumeSpecName: "scripts") pod "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" (UID: "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.545873 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" (UID: "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.597042 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" (UID: "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.610463 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbvb9\" (UniqueName: \"kubernetes.io/projected/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-kube-api-access-fbvb9\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.610493 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.610502 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.610511 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.610519 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.610527 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.625074 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-config-data" (OuterVolumeSpecName: "config-data") pod "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" (UID: "a6485d2e-1d7a-4b8e-a4ba-9985cec52aae"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.711732 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.733522 4830 generic.go:334] "Generic (PLEG): container finished" podID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerID="38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94" exitCode=0 Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.733659 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.733715 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerDied","Data":"38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94"} Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.733751 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6485d2e-1d7a-4b8e-a4ba-9985cec52aae","Type":"ContainerDied","Data":"58c125f1492262ef984be51413d645e5cf2c062b6f16063a9467a9cc5a71dba0"} Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.733773 4830 scope.go:117] "RemoveContainer" containerID="e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.759593 4830 scope.go:117] "RemoveContainer" containerID="7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.779670 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.783709 4830 scope.go:117] "RemoveContainer" 
containerID="38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.786071 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.815009 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:34 crc kubenswrapper[4830]: E0311 09:36:34.815414 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="sg-core" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.815436 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="sg-core" Mar 11 09:36:34 crc kubenswrapper[4830]: E0311 09:36:34.815455 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="ceilometer-central-agent" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.815464 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="ceilometer-central-agent" Mar 11 09:36:34 crc kubenswrapper[4830]: E0311 09:36:34.815498 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="proxy-httpd" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.815506 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="proxy-httpd" Mar 11 09:36:34 crc kubenswrapper[4830]: E0311 09:36:34.815521 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="ceilometer-notification-agent" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.815529 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="ceilometer-notification-agent" Mar 11 09:36:34 
crc kubenswrapper[4830]: I0311 09:36:34.815721 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="proxy-httpd" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.815750 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="sg-core" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.815767 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="ceilometer-notification-agent" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.815788 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" containerName="ceilometer-central-agent" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.822786 4830 scope.go:117] "RemoveContainer" containerID="4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.837610 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.837721 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.840610 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.841953 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.870613 4830 scope.go:117] "RemoveContainer" containerID="e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50" Mar 11 09:36:34 crc kubenswrapper[4830]: E0311 09:36:34.870926 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50\": container with ID starting with e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50 not found: ID does not exist" containerID="e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.870966 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50"} err="failed to get container status \"e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50\": rpc error: code = NotFound desc = could not find container \"e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50\": container with ID starting with e5b2897754a1e6242070c186fb818493b2b59039139f1921fe7b09746a7d3a50 not found: ID does not exist" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.870987 4830 scope.go:117] "RemoveContainer" containerID="7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e" Mar 11 09:36:34 crc kubenswrapper[4830]: E0311 09:36:34.871214 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e\": container with ID starting with 7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e not found: ID does not exist" containerID="7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.871242 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e"} err="failed to get container status \"7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e\": rpc error: code = NotFound desc = could not find container \"7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e\": container with ID starting with 7f6b941ec8cc107a8ac1a9ba7e16195ba58ef8fb61a6fa0e337eea1c33a9cc7e not found: ID does not exist" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.871257 4830 scope.go:117] "RemoveContainer" containerID="38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94" Mar 11 09:36:34 crc kubenswrapper[4830]: E0311 09:36:34.871592 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94\": container with ID starting with 38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94 not found: ID does not exist" containerID="38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.871617 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94"} err="failed to get container status \"38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94\": rpc error: code = NotFound desc = could not find container \"38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94\": container with ID 
starting with 38725d267d1787ba6e78b54c68ec717d6c32c1e3b904d8deee171634df24eb94 not found: ID does not exist" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.871632 4830 scope.go:117] "RemoveContainer" containerID="4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8" Mar 11 09:36:34 crc kubenswrapper[4830]: E0311 09:36:34.872130 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8\": container with ID starting with 4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8 not found: ID does not exist" containerID="4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.872152 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8"} err="failed to get container status \"4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8\": rpc error: code = NotFound desc = could not find container \"4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8\": container with ID starting with 4665c7fdf344b7d9300298857bd376e45443f5381aff6c25f592f752e16d80c8 not found: ID does not exist" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.915421 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.915466 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-config-data\") pod 
\"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.915552 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d9t7\" (UniqueName: \"kubernetes.io/projected/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-kube-api-access-5d9t7\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.915607 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-run-httpd\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.915735 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.915771 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-log-httpd\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: I0311 09:36:34.915792 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-scripts\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:34 crc kubenswrapper[4830]: 
I0311 09:36:34.943542 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6485d2e-1d7a-4b8e-a4ba-9985cec52aae" path="/var/lib/kubelet/pods/a6485d2e-1d7a-4b8e-a4ba-9985cec52aae/volumes" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.017360 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d9t7\" (UniqueName: \"kubernetes.io/projected/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-kube-api-access-5d9t7\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.017411 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-run-httpd\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.017489 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.017839 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-log-httpd\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.017872 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-scripts\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc 
kubenswrapper[4830]: I0311 09:36:35.017954 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.017946 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-run-httpd\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.017985 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-config-data\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.018262 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-log-httpd\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.021837 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.022620 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-scripts\") pod \"ceilometer-0\" (UID: 
\"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.023850 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-config-data\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.026799 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.042466 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d9t7\" (UniqueName: \"kubernetes.io/projected/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-kube-api-access-5d9t7\") pod \"ceilometer-0\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.166463 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.611027 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:36:35 crc kubenswrapper[4830]: W0311 09:36:35.614590 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71af19d_ca84_46f9_94d1_7ff4a6b1f861.slice/crio-738c51d8ee94576774ae5f28a9e0058f2bcc38d1c10c400da4cf079dcd6bd1f1 WatchSource:0}: Error finding container 738c51d8ee94576774ae5f28a9e0058f2bcc38d1c10c400da4cf079dcd6bd1f1: Status 404 returned error can't find the container with id 738c51d8ee94576774ae5f28a9e0058f2bcc38d1c10c400da4cf079dcd6bd1f1 Mar 11 09:36:35 crc kubenswrapper[4830]: I0311 09:36:35.745295 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerStarted","Data":"738c51d8ee94576774ae5f28a9e0058f2bcc38d1c10c400da4cf079dcd6bd1f1"} Mar 11 09:36:37 crc kubenswrapper[4830]: I0311 09:36:37.769322 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerStarted","Data":"9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f"} Mar 11 09:36:38 crc kubenswrapper[4830]: I0311 09:36:38.780090 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerStarted","Data":"0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0"} Mar 11 09:36:38 crc kubenswrapper[4830]: I0311 09:36:38.780444 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerStarted","Data":"a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe"} Mar 11 09:36:40 crc kubenswrapper[4830]: I0311 
09:36:40.818285 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerStarted","Data":"17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182"} Mar 11 09:36:40 crc kubenswrapper[4830]: I0311 09:36:40.818854 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 09:36:41 crc kubenswrapper[4830]: I0311 09:36:41.867063 4830 generic.go:334] "Generic (PLEG): container finished" podID="8f0cbfba-9a9b-43cd-8d56-b500764edebd" containerID="db59682059ad5d3852437c009d1ec91a3b7d8c392986953cfd586a2f8968ab27" exitCode=0 Mar 11 09:36:41 crc kubenswrapper[4830]: I0311 09:36:41.867158 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" event={"ID":"8f0cbfba-9a9b-43cd-8d56-b500764edebd","Type":"ContainerDied","Data":"db59682059ad5d3852437c009d1ec91a3b7d8c392986953cfd586a2f8968ab27"} Mar 11 09:36:41 crc kubenswrapper[4830]: I0311 09:36:41.887894 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.387785381 podStartE2EDuration="7.887874577s" podCreationTimestamp="2026-03-11 09:36:34 +0000 UTC" firstStartedPulling="2026-03-11 09:36:35.61686529 +0000 UTC m=+1363.398015969" lastFinishedPulling="2026-03-11 09:36:40.116954466 +0000 UTC m=+1367.898105165" observedRunningTime="2026-03-11 09:36:40.845119443 +0000 UTC m=+1368.626270152" watchObservedRunningTime="2026-03-11 09:36:41.887874577 +0000 UTC m=+1369.669025276" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.066374 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 
09:36:43.066760 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.232369 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.293176 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-scripts\") pod \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.293329 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-combined-ca-bundle\") pod \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.293385 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68cgr\" (UniqueName: \"kubernetes.io/projected/8f0cbfba-9a9b-43cd-8d56-b500764edebd-kube-api-access-68cgr\") pod \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.293484 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-config-data\") pod \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\" (UID: \"8f0cbfba-9a9b-43cd-8d56-b500764edebd\") " Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 
09:36:43.298692 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0cbfba-9a9b-43cd-8d56-b500764edebd-kube-api-access-68cgr" (OuterVolumeSpecName: "kube-api-access-68cgr") pod "8f0cbfba-9a9b-43cd-8d56-b500764edebd" (UID: "8f0cbfba-9a9b-43cd-8d56-b500764edebd"). InnerVolumeSpecName "kube-api-access-68cgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.298812 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-scripts" (OuterVolumeSpecName: "scripts") pod "8f0cbfba-9a9b-43cd-8d56-b500764edebd" (UID: "8f0cbfba-9a9b-43cd-8d56-b500764edebd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.322610 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-config-data" (OuterVolumeSpecName: "config-data") pod "8f0cbfba-9a9b-43cd-8d56-b500764edebd" (UID: "8f0cbfba-9a9b-43cd-8d56-b500764edebd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.324533 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f0cbfba-9a9b-43cd-8d56-b500764edebd" (UID: "8f0cbfba-9a9b-43cd-8d56-b500764edebd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.396167 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68cgr\" (UniqueName: \"kubernetes.io/projected/8f0cbfba-9a9b-43cd-8d56-b500764edebd-kube-api-access-68cgr\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.396205 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.396215 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.396224 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f0cbfba-9a9b-43cd-8d56-b500764edebd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.884494 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" event={"ID":"8f0cbfba-9a9b-43cd-8d56-b500764edebd","Type":"ContainerDied","Data":"d846ebcf6e143921cf84dbc1fdbae942e2ec2ce82a3ba124924fa82a28d5b554"} Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.884828 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d846ebcf6e143921cf84dbc1fdbae942e2ec2ce82a3ba124924fa82a28d5b554" Mar 11 09:36:43 crc kubenswrapper[4830]: I0311 09:36:43.884893 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmx2j" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.047130 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 09:36:44 crc kubenswrapper[4830]: E0311 09:36:44.047590 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f0cbfba-9a9b-43cd-8d56-b500764edebd" containerName="nova-cell0-conductor-db-sync" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.047613 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0cbfba-9a9b-43cd-8d56-b500764edebd" containerName="nova-cell0-conductor-db-sync" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.047780 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f0cbfba-9a9b-43cd-8d56-b500764edebd" containerName="nova-cell0-conductor-db-sync" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.048399 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.050467 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.051306 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vd559" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.065132 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.109842 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc74c44-bbbb-4dd9-b762-f7c483d0e336-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: 
I0311 09:36:44.109925 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpzrr\" (UniqueName: \"kubernetes.io/projected/efc74c44-bbbb-4dd9-b762-f7c483d0e336-kube-api-access-gpzrr\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.109964 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc74c44-bbbb-4dd9-b762-f7c483d0e336-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.211685 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc74c44-bbbb-4dd9-b762-f7c483d0e336-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.211773 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpzrr\" (UniqueName: \"kubernetes.io/projected/efc74c44-bbbb-4dd9-b762-f7c483d0e336-kube-api-access-gpzrr\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.211808 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc74c44-bbbb-4dd9-b762-f7c483d0e336-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.216851 4830 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc74c44-bbbb-4dd9-b762-f7c483d0e336-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.220539 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc74c44-bbbb-4dd9-b762-f7c483d0e336-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.241547 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpzrr\" (UniqueName: \"kubernetes.io/projected/efc74c44-bbbb-4dd9-b762-f7c483d0e336-kube-api-access-gpzrr\") pod \"nova-cell0-conductor-0\" (UID: \"efc74c44-bbbb-4dd9-b762-f7c483d0e336\") " pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.369731 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.804926 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 11 09:36:44 crc kubenswrapper[4830]: I0311 09:36:44.897352 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"efc74c44-bbbb-4dd9-b762-f7c483d0e336","Type":"ContainerStarted","Data":"d96f4187e9fa43cc83cdf8ef5d2414b770fb4bac4f8b9fe0f85d665115691379"} Mar 11 09:36:45 crc kubenswrapper[4830]: I0311 09:36:45.906529 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"efc74c44-bbbb-4dd9-b762-f7c483d0e336","Type":"ContainerStarted","Data":"a80e870d771a9cfcdceeec35474e132f2d71128086819db06432cb5af8c740ee"} Mar 11 09:36:45 crc kubenswrapper[4830]: I0311 09:36:45.906860 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:45 crc kubenswrapper[4830]: I0311 09:36:45.925523 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.925504278 podStartE2EDuration="1.925504278s" podCreationTimestamp="2026-03-11 09:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:36:45.920567921 +0000 UTC m=+1373.701718630" watchObservedRunningTime="2026-03-11 09:36:45.925504278 +0000 UTC m=+1373.706654967" Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.396120 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.872932 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-67js2"] Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.874515 4830 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.877965 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.878546 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.881517 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-67js2"] Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.928121 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-scripts\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.928176 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.928218 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4zc\" (UniqueName: \"kubernetes.io/projected/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-kube-api-access-hl4zc\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:54 crc kubenswrapper[4830]: I0311 09:36:54.928342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-config-data\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.030359 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-scripts\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.030415 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.030456 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4zc\" (UniqueName: \"kubernetes.io/projected/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-kube-api-access-hl4zc\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.030501 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-config-data\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.036931 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-scripts\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.039531 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.043777 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-config-data\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.080085 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.081900 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.083837 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4zc\" (UniqueName: \"kubernetes.io/projected/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-kube-api-access-hl4zc\") pod \"nova-cell0-cell-mapping-67js2\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.087335 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.095508 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.096618 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.099080 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.117341 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.134983 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.135042 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndhq4\" (UniqueName: \"kubernetes.io/projected/d9c21203-bcf1-474e-abf5-18f188e0553a-kube-api-access-ndhq4\") pod \"nova-api-0\" (UID: 
\"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.135073 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-config-data\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.135265 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlmk8\" (UniqueName: \"kubernetes.io/projected/52008989-6ec1-41f8-99b6-7daeb3591033-kube-api-access-hlmk8\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.135409 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.135482 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c21203-bcf1-474e-abf5-18f188e0553a-logs\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.135609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 
09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.141101 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.209033 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.238553 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.238621 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndhq4\" (UniqueName: \"kubernetes.io/projected/d9c21203-bcf1-474e-abf5-18f188e0553a-kube-api-access-ndhq4\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.238656 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-config-data\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.238772 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlmk8\" (UniqueName: \"kubernetes.io/projected/52008989-6ec1-41f8-99b6-7daeb3591033-kube-api-access-hlmk8\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.238856 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.238917 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c21203-bcf1-474e-abf5-18f188e0553a-logs\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.238975 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.241122 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c21203-bcf1-474e-abf5-18f188e0553a-logs\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.247521 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-config-data\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.252217 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.253969 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.254820 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.257584 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.260388 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.260854 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.275346 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlmk8\" (UniqueName: \"kubernetes.io/projected/52008989-6ec1-41f8-99b6-7daeb3591033-kube-api-access-hlmk8\") pod \"nova-cell1-novncproxy-0\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.275737 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndhq4\" (UniqueName: \"kubernetes.io/projected/d9c21203-bcf1-474e-abf5-18f188e0553a-kube-api-access-ndhq4\") pod \"nova-api-0\" (UID: 
\"d9c21203-bcf1-474e-abf5-18f188e0553a\") " pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.342149 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.342256 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-config-data\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.342340 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d294\" (UniqueName: \"kubernetes.io/projected/1b096376-c676-4be7-8072-6ca236669969-kube-api-access-5d294\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.342411 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b096376-c676-4be7-8072-6ca236669969-logs\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.348631 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.444662 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-config-data\") pod 
\"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.444840 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d294\" (UniqueName: \"kubernetes.io/projected/1b096376-c676-4be7-8072-6ca236669969-kube-api-access-5d294\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.444929 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b096376-c676-4be7-8072-6ca236669969-logs\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.444979 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.453094 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.453316 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b096376-c676-4be7-8072-6ca236669969-logs\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.454214 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") 
" pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.457880 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.463193 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.464306 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-config-data\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.473477 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d294\" (UniqueName: \"kubernetes.io/projected/1b096376-c676-4be7-8072-6ca236669969-kube-api-access-5d294\") pod \"nova-metadata-0\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.489934 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.501509 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.512049 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.547086 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.547422 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mw7t\" (UniqueName: \"kubernetes.io/projected/c70fe727-3f0d-4884-8ec6-c841f019b453-kube-api-access-9mw7t\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.547514 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-config-data\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.557075 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-v79d9"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.558670 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.613467 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-v79d9"] Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.650894 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-config-data\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.650986 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.651045 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-svc\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.651093 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.651123 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9m2s\" (UniqueName: 
\"kubernetes.io/projected/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-kube-api-access-f9m2s\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.651145 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.651170 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-config\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.651193 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.651232 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mw7t\" (UniqueName: \"kubernetes.io/projected/c70fe727-3f0d-4884-8ec6-c841f019b453-kube-api-access-9mw7t\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.672419 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.682174 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-config-data\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.682757 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.691152 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mw7t\" (UniqueName: \"kubernetes.io/projected/c70fe727-3f0d-4884-8ec6-c841f019b453-kube-api-access-9mw7t\") pod \"nova-scheduler-0\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.763820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.764092 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-svc\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.764204 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9m2s\" (UniqueName: \"kubernetes.io/projected/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-kube-api-access-f9m2s\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.764306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.764438 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-config\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.764521 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.764927 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-swift-storage-0\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.765490 4830 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-svc\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.766571 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-sb\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.771450 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-config\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.771776 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-nb\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.787860 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9m2s\" (UniqueName: \"kubernetes.io/projected/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-kube-api-access-f9m2s\") pod \"dnsmasq-dns-865f5d856f-v79d9\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.843811 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:36:55 crc kubenswrapper[4830]: I0311 09:36:55.933468 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.005146 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-67js2"] Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.190137 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.214080 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.355497 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.414488 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d9ntp"] Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.415701 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.417859 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.418198 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.430253 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d9ntp"] Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.449075 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.486423 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpwbf\" (UniqueName: \"kubernetes.io/projected/5849038f-38d8-48c8-a4d8-70dc0166cdf9-kube-api-access-qpwbf\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.486679 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-scripts\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.486759 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " 
pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.487097 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-config-data\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.534690 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-v79d9"] Mar 11 09:36:56 crc kubenswrapper[4830]: W0311 09:36:56.547607 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46c0d3f7_fb22_458f_a74d_a4fb397d3d68.slice/crio-dff143a38561dfe9da81146945b3600213fd20877f9524eb9b81730cb0a6648f WatchSource:0}: Error finding container dff143a38561dfe9da81146945b3600213fd20877f9524eb9b81730cb0a6648f: Status 404 returned error can't find the container with id dff143a38561dfe9da81146945b3600213fd20877f9524eb9b81730cb0a6648f Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.589337 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-scripts\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.589395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: 
I0311 09:36:56.589454 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-config-data\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.589483 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwbf\" (UniqueName: \"kubernetes.io/projected/5849038f-38d8-48c8-a4d8-70dc0166cdf9-kube-api-access-qpwbf\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.593686 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-scripts\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.593729 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.593798 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-config-data\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.605750 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpwbf\" (UniqueName: \"kubernetes.io/projected/5849038f-38d8-48c8-a4d8-70dc0166cdf9-kube-api-access-qpwbf\") pod \"nova-cell1-conductor-db-sync-d9ntp\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:56 crc kubenswrapper[4830]: I0311 09:36:56.739351 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.019374 4830 generic.go:334] "Generic (PLEG): container finished" podID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" containerID="7b1c25d3125d8e397653d1b870709aeb71fc9b83e76f2ddf62567f90c902d105" exitCode=0 Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.019686 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" event={"ID":"46c0d3f7-fb22-458f-a74d-a4fb397d3d68","Type":"ContainerDied","Data":"7b1c25d3125d8e397653d1b870709aeb71fc9b83e76f2ddf62567f90c902d105"} Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.019710 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" event={"ID":"46c0d3f7-fb22-458f-a74d-a4fb397d3d68","Type":"ContainerStarted","Data":"dff143a38561dfe9da81146945b3600213fd20877f9524eb9b81730cb0a6648f"} Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.026285 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b096376-c676-4be7-8072-6ca236669969","Type":"ContainerStarted","Data":"71938a240528c472bf8fb58205d977aaf6b1db10d1a6ec149f4a8e841db16fee"} Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.027741 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"52008989-6ec1-41f8-99b6-7daeb3591033","Type":"ContainerStarted","Data":"d44fb77c65da46d64bbbc6c15d2e704f6f4d3d58d391e6b127c1c3b91f1ab781"} Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.030518 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-67js2" event={"ID":"7a498e09-d7b4-4af2-bb1a-dff67b8ce005","Type":"ContainerStarted","Data":"5473ea6d9d3da892e42d77657dcb21aa2384b3204229d921b89cd169b0193028"} Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.030562 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-67js2" event={"ID":"7a498e09-d7b4-4af2-bb1a-dff67b8ce005","Type":"ContainerStarted","Data":"ca847a04151483261ee07832009f09dfb9a35bbdb819c6547a8f4fd7802aee25"} Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.032579 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c70fe727-3f0d-4884-8ec6-c841f019b453","Type":"ContainerStarted","Data":"bd1a29b6d5c2ef1aa533f8bdc362d399056e5d09574c82eb308cc4d26497b300"} Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.038492 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9c21203-bcf1-474e-abf5-18f188e0553a","Type":"ContainerStarted","Data":"5a6bea82a5d8cfa88878064fbd299b17cd86912774c298b17a6f8cbe16a91bff"} Mar 11 09:36:57 crc kubenswrapper[4830]: I0311 09:36:57.061135 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-67js2" podStartSLOduration=3.061109419 podStartE2EDuration="3.061109419s" podCreationTimestamp="2026-03-11 09:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:36:57.056571823 +0000 UTC m=+1384.837722532" watchObservedRunningTime="2026-03-11 09:36:57.061109419 +0000 UTC m=+1384.842260128" Mar 11 09:36:57 crc kubenswrapper[4830]: 
I0311 09:36:57.209585 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d9ntp"] Mar 11 09:36:57 crc kubenswrapper[4830]: W0311 09:36:57.230247 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5849038f_38d8_48c8_a4d8_70dc0166cdf9.slice/crio-9e88d9f98f070feac97e35a343325bd5df8a687ff7f6d2c0f0a9e65364bb4955 WatchSource:0}: Error finding container 9e88d9f98f070feac97e35a343325bd5df8a687ff7f6d2c0f0a9e65364bb4955: Status 404 returned error can't find the container with id 9e88d9f98f070feac97e35a343325bd5df8a687ff7f6d2c0f0a9e65364bb4955 Mar 11 09:36:58 crc kubenswrapper[4830]: I0311 09:36:58.047928 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" event={"ID":"5849038f-38d8-48c8-a4d8-70dc0166cdf9","Type":"ContainerStarted","Data":"c24780ade9f15d78f78d1861d632e31d983b213a8e04fc73edb39ca91e6b403a"} Mar 11 09:36:58 crc kubenswrapper[4830]: I0311 09:36:58.048321 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" event={"ID":"5849038f-38d8-48c8-a4d8-70dc0166cdf9","Type":"ContainerStarted","Data":"9e88d9f98f070feac97e35a343325bd5df8a687ff7f6d2c0f0a9e65364bb4955"} Mar 11 09:36:58 crc kubenswrapper[4830]: I0311 09:36:58.049885 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" event={"ID":"46c0d3f7-fb22-458f-a74d-a4fb397d3d68","Type":"ContainerStarted","Data":"acb320b4280fed5ec1050ed40d9719952f4b6f7144d5345114350863442fb9d4"} Mar 11 09:36:58 crc kubenswrapper[4830]: I0311 09:36:58.050114 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:36:58 crc kubenswrapper[4830]: I0311 09:36:58.071726 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" 
podStartSLOduration=2.071708173 podStartE2EDuration="2.071708173s" podCreationTimestamp="2026-03-11 09:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:36:58.065537071 +0000 UTC m=+1385.846687760" watchObservedRunningTime="2026-03-11 09:36:58.071708173 +0000 UTC m=+1385.852858862" Mar 11 09:36:58 crc kubenswrapper[4830]: I0311 09:36:58.099623 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" podStartSLOduration=3.099605486 podStartE2EDuration="3.099605486s" podCreationTimestamp="2026-03-11 09:36:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:36:58.094571806 +0000 UTC m=+1385.875722515" watchObservedRunningTime="2026-03-11 09:36:58.099605486 +0000 UTC m=+1385.880756175" Mar 11 09:36:59 crc kubenswrapper[4830]: I0311 09:36:59.141948 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:36:59 crc kubenswrapper[4830]: I0311 09:36:59.153268 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:37:01 crc kubenswrapper[4830]: I0311 09:37:01.090905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9c21203-bcf1-474e-abf5-18f188e0553a","Type":"ContainerStarted","Data":"539360b564f81f0d337c6d6f03c266ad668f86480266c5cc6f96ab141e110887"} Mar 11 09:37:01 crc kubenswrapper[4830]: I0311 09:37:01.094744 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b096376-c676-4be7-8072-6ca236669969","Type":"ContainerStarted","Data":"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116"} Mar 11 09:37:01 crc kubenswrapper[4830]: I0311 09:37:01.097012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52008989-6ec1-41f8-99b6-7daeb3591033","Type":"ContainerStarted","Data":"9f7699a8975d2876363c67476e644a5ef45ba1160b67faaad9196da93584601f"} Mar 11 09:37:01 crc kubenswrapper[4830]: I0311 09:37:01.097121 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="52008989-6ec1-41f8-99b6-7daeb3591033" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9f7699a8975d2876363c67476e644a5ef45ba1160b67faaad9196da93584601f" gracePeriod=30 Mar 11 09:37:01 crc kubenswrapper[4830]: I0311 09:37:01.100290 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c70fe727-3f0d-4884-8ec6-c841f019b453","Type":"ContainerStarted","Data":"d5d4afeb4990f835d0cd34977a6e9e780dee1fa9d33a0df30ac56c2de3ec16ba"} Mar 11 09:37:01 crc kubenswrapper[4830]: I0311 09:37:01.127746 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.79627281 podStartE2EDuration="6.127724083s" podCreationTimestamp="2026-03-11 09:36:55 +0000 UTC" firstStartedPulling="2026-03-11 09:36:56.212620028 +0000 UTC m=+1383.993770737" lastFinishedPulling="2026-03-11 09:37:00.544071321 +0000 UTC m=+1388.325222010" observedRunningTime="2026-03-11 09:37:01.126851879 +0000 UTC m=+1388.908002598" watchObservedRunningTime="2026-03-11 09:37:01.127724083 +0000 UTC m=+1388.908874772" Mar 11 09:37:01 crc kubenswrapper[4830]: I0311 09:37:01.152276 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.048249123 podStartE2EDuration="6.152252413s" podCreationTimestamp="2026-03-11 09:36:55 +0000 UTC" firstStartedPulling="2026-03-11 09:36:56.440140133 +0000 UTC m=+1384.221290822" lastFinishedPulling="2026-03-11 09:37:00.544143433 +0000 UTC m=+1388.325294112" observedRunningTime="2026-03-11 09:37:01.142573575 +0000 UTC 
m=+1388.923724284" watchObservedRunningTime="2026-03-11 09:37:01.152252413 +0000 UTC m=+1388.933403102" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.115072 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b096376-c676-4be7-8072-6ca236669969","Type":"ContainerStarted","Data":"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a"} Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.115365 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1b096376-c676-4be7-8072-6ca236669969" containerName="nova-metadata-metadata" containerID="cri-o://55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a" gracePeriod=30 Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.115264 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1b096376-c676-4be7-8072-6ca236669969" containerName="nova-metadata-log" containerID="cri-o://4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116" gracePeriod=30 Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.123512 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9c21203-bcf1-474e-abf5-18f188e0553a","Type":"ContainerStarted","Data":"6b8eb712f98e7f7c153f5ee3fe37d54c8457ea4ddff207f832dde3bb84ecaa9e"} Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.147865 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.970352514 podStartE2EDuration="7.14784414s" podCreationTimestamp="2026-03-11 09:36:55 +0000 UTC" firstStartedPulling="2026-03-11 09:36:56.369184176 +0000 UTC m=+1384.150334875" lastFinishedPulling="2026-03-11 09:37:00.546675812 +0000 UTC m=+1388.327826501" observedRunningTime="2026-03-11 09:37:02.146152733 +0000 UTC m=+1389.927303462" watchObservedRunningTime="2026-03-11 09:37:02.14784414 +0000 
UTC m=+1389.928994829" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.184395 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.852533508 podStartE2EDuration="7.184372522s" podCreationTimestamp="2026-03-11 09:36:55 +0000 UTC" firstStartedPulling="2026-03-11 09:36:56.212239257 +0000 UTC m=+1383.993389946" lastFinishedPulling="2026-03-11 09:37:00.544078271 +0000 UTC m=+1388.325228960" observedRunningTime="2026-03-11 09:37:02.175220879 +0000 UTC m=+1389.956371568" watchObservedRunningTime="2026-03-11 09:37:02.184372522 +0000 UTC m=+1389.965523211" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.731177 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.825165 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b096376-c676-4be7-8072-6ca236669969-logs\") pod \"1b096376-c676-4be7-8072-6ca236669969\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.825266 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-combined-ca-bundle\") pod \"1b096376-c676-4be7-8072-6ca236669969\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.825402 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d294\" (UniqueName: \"kubernetes.io/projected/1b096376-c676-4be7-8072-6ca236669969-kube-api-access-5d294\") pod \"1b096376-c676-4be7-8072-6ca236669969\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.825422 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-config-data\") pod \"1b096376-c676-4be7-8072-6ca236669969\" (UID: \"1b096376-c676-4be7-8072-6ca236669969\") " Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.825565 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b096376-c676-4be7-8072-6ca236669969-logs" (OuterVolumeSpecName: "logs") pod "1b096376-c676-4be7-8072-6ca236669969" (UID: "1b096376-c676-4be7-8072-6ca236669969"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.826869 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b096376-c676-4be7-8072-6ca236669969-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.831053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b096376-c676-4be7-8072-6ca236669969-kube-api-access-5d294" (OuterVolumeSpecName: "kube-api-access-5d294") pod "1b096376-c676-4be7-8072-6ca236669969" (UID: "1b096376-c676-4be7-8072-6ca236669969"). InnerVolumeSpecName "kube-api-access-5d294". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.857054 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-config-data" (OuterVolumeSpecName: "config-data") pod "1b096376-c676-4be7-8072-6ca236669969" (UID: "1b096376-c676-4be7-8072-6ca236669969"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.858414 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b096376-c676-4be7-8072-6ca236669969" (UID: "1b096376-c676-4be7-8072-6ca236669969"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.928725 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d294\" (UniqueName: \"kubernetes.io/projected/1b096376-c676-4be7-8072-6ca236669969-kube-api-access-5d294\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.929039 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:02 crc kubenswrapper[4830]: I0311 09:37:02.929108 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b096376-c676-4be7-8072-6ca236669969-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.132336 4830 generic.go:334] "Generic (PLEG): container finished" podID="1b096376-c676-4be7-8072-6ca236669969" containerID="55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a" exitCode=0 Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.132373 4830 generic.go:334] "Generic (PLEG): container finished" podID="1b096376-c676-4be7-8072-6ca236669969" containerID="4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116" exitCode=143 Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.132430 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1b096376-c676-4be7-8072-6ca236669969","Type":"ContainerDied","Data":"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a"} Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.132476 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b096376-c676-4be7-8072-6ca236669969","Type":"ContainerDied","Data":"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116"} Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.132495 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1b096376-c676-4be7-8072-6ca236669969","Type":"ContainerDied","Data":"71938a240528c472bf8fb58205d977aaf6b1db10d1a6ec149f4a8e841db16fee"} Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.132516 4830 scope.go:117] "RemoveContainer" containerID="55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.133411 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.158885 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.169704 4830 scope.go:117] "RemoveContainer" containerID="4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.175349 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.189498 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:03 crc kubenswrapper[4830]: E0311 09:37:03.189910 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b096376-c676-4be7-8072-6ca236669969" containerName="nova-metadata-log" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.189927 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b096376-c676-4be7-8072-6ca236669969" containerName="nova-metadata-log" Mar 11 09:37:03 crc kubenswrapper[4830]: E0311 09:37:03.189940 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b096376-c676-4be7-8072-6ca236669969" containerName="nova-metadata-metadata" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.189946 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b096376-c676-4be7-8072-6ca236669969" containerName="nova-metadata-metadata" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.201130 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b096376-c676-4be7-8072-6ca236669969" containerName="nova-metadata-log" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.201168 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b096376-c676-4be7-8072-6ca236669969" containerName="nova-metadata-metadata" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.202353 4830 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.202676 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.205086 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.205262 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.214395 4830 scope.go:117] "RemoveContainer" containerID="55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a" Mar 11 09:37:03 crc kubenswrapper[4830]: E0311 09:37:03.217454 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a\": container with ID starting with 55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a not found: ID does not exist" containerID="55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.217513 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a"} err="failed to get container status \"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a\": rpc error: code = NotFound desc = could not find container \"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a\": container with ID starting with 55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a not found: ID does not exist" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.217543 4830 scope.go:117] "RemoveContainer" containerID="4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116" 
Mar 11 09:37:03 crc kubenswrapper[4830]: E0311 09:37:03.218694 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116\": container with ID starting with 4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116 not found: ID does not exist" containerID="4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.218731 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116"} err="failed to get container status \"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116\": rpc error: code = NotFound desc = could not find container \"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116\": container with ID starting with 4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116 not found: ID does not exist" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.218785 4830 scope.go:117] "RemoveContainer" containerID="55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.221781 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a"} err="failed to get container status \"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a\": rpc error: code = NotFound desc = could not find container \"55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a\": container with ID starting with 55a69568dfe674a3319c5b7bdf79049aa1cab33ea1ebc49a6fd53df7deb39a7a not found: ID does not exist" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.221817 4830 scope.go:117] "RemoveContainer" 
containerID="4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.222139 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116"} err="failed to get container status \"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116\": rpc error: code = NotFound desc = could not find container \"4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116\": container with ID starting with 4a71e4cb82515b04e528150d1e9c05cfc1bb8dbb06f8bb3f1376d008e30b2116 not found: ID does not exist" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.233968 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.234044 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzskn\" (UniqueName: \"kubernetes.io/projected/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-kube-api-access-jzskn\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.234087 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-config-data\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.234697 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-logs\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.234812 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.336535 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.336687 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.336734 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzskn\" (UniqueName: \"kubernetes.io/projected/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-kube-api-access-jzskn\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.336766 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-config-data\") pod \"nova-metadata-0\" (UID: 
\"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.336795 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-logs\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.337366 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-logs\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.341114 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.341138 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-config-data\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.345838 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.353297 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzskn\" 
(UniqueName: \"kubernetes.io/projected/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-kube-api-access-jzskn\") pod \"nova-metadata-0\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.522395 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:03 crc kubenswrapper[4830]: I0311 09:37:03.996993 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:04 crc kubenswrapper[4830]: I0311 09:37:04.159391 4830 generic.go:334] "Generic (PLEG): container finished" podID="7a498e09-d7b4-4af2-bb1a-dff67b8ce005" containerID="5473ea6d9d3da892e42d77657dcb21aa2384b3204229d921b89cd169b0193028" exitCode=0 Mar 11 09:37:04 crc kubenswrapper[4830]: I0311 09:37:04.159460 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-67js2" event={"ID":"7a498e09-d7b4-4af2-bb1a-dff67b8ce005","Type":"ContainerDied","Data":"5473ea6d9d3da892e42d77657dcb21aa2384b3204229d921b89cd169b0193028"} Mar 11 09:37:04 crc kubenswrapper[4830]: I0311 09:37:04.160718 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"471cb0dc-46e4-4ac5-b55a-dcb053cacb31","Type":"ContainerStarted","Data":"fbaa6dbc96a9a144c1afcc0ec633ea9f66f4eb9575c397d999fcabc6aa03bb63"} Mar 11 09:37:04 crc kubenswrapper[4830]: I0311 09:37:04.942835 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b096376-c676-4be7-8072-6ca236669969" path="/var/lib/kubelet/pods/1b096376-c676-4be7-8072-6ca236669969/volumes" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.173623 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"471cb0dc-46e4-4ac5-b55a-dcb053cacb31","Type":"ContainerStarted","Data":"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea"} Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 
09:37:05.173662 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"471cb0dc-46e4-4ac5-b55a-dcb053cacb31","Type":"ContainerStarted","Data":"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a"} Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.174554 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.213372 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.213353483 podStartE2EDuration="2.213353483s" podCreationTimestamp="2026-03-11 09:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:05.194961884 +0000 UTC m=+1392.976112583" watchObservedRunningTime="2026-03-11 09:37:05.213353483 +0000 UTC m=+1392.994504172" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.490960 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.491099 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.505177 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.561708 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.596262 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-config-data\") pod \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.596332 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-scripts\") pod \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.596387 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl4zc\" (UniqueName: \"kubernetes.io/projected/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-kube-api-access-hl4zc\") pod \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.596528 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-combined-ca-bundle\") pod \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\" (UID: \"7a498e09-d7b4-4af2-bb1a-dff67b8ce005\") " Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.602345 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-scripts" (OuterVolumeSpecName: "scripts") pod "7a498e09-d7b4-4af2-bb1a-dff67b8ce005" (UID: "7a498e09-d7b4-4af2-bb1a-dff67b8ce005"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.602538 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-kube-api-access-hl4zc" (OuterVolumeSpecName: "kube-api-access-hl4zc") pod "7a498e09-d7b4-4af2-bb1a-dff67b8ce005" (UID: "7a498e09-d7b4-4af2-bb1a-dff67b8ce005"). InnerVolumeSpecName "kube-api-access-hl4zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.625219 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a498e09-d7b4-4af2-bb1a-dff67b8ce005" (UID: "7a498e09-d7b4-4af2-bb1a-dff67b8ce005"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.631942 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-config-data" (OuterVolumeSpecName: "config-data") pod "7a498e09-d7b4-4af2-bb1a-dff67b8ce005" (UID: "7a498e09-d7b4-4af2-bb1a-dff67b8ce005"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.699198 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.699230 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.699239 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl4zc\" (UniqueName: \"kubernetes.io/projected/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-kube-api-access-hl4zc\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.699251 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a498e09-d7b4-4af2-bb1a-dff67b8ce005-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.845222 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.845261 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.875284 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.935187 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:37:05 crc kubenswrapper[4830]: I0311 09:37:05.994380 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-2xcmq"] Mar 11 09:37:05 crc 
kubenswrapper[4830]: I0311 09:37:05.994634 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" podUID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" containerName="dnsmasq-dns" containerID="cri-o://f45b5be0bef291446fd5d3bd7d090b740138b75f83e037bab72b907404b42ef6" gracePeriod=10 Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.188128 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-67js2" event={"ID":"7a498e09-d7b4-4af2-bb1a-dff67b8ce005","Type":"ContainerDied","Data":"ca847a04151483261ee07832009f09dfb9a35bbdb819c6547a8f4fd7802aee25"} Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.188170 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca847a04151483261ee07832009f09dfb9a35bbdb819c6547a8f4fd7802aee25" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.188229 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-67js2" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.201510 4830 generic.go:334] "Generic (PLEG): container finished" podID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" containerID="f45b5be0bef291446fd5d3bd7d090b740138b75f83e037bab72b907404b42ef6" exitCode=0 Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.201604 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" event={"ID":"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7","Type":"ContainerDied","Data":"f45b5be0bef291446fd5d3bd7d090b740138b75f83e037bab72b907404b42ef6"} Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.203770 4830 generic.go:334] "Generic (PLEG): container finished" podID="5849038f-38d8-48c8-a4d8-70dc0166cdf9" containerID="c24780ade9f15d78f78d1861d632e31d983b213a8e04fc73edb39ca91e6b403a" exitCode=0 Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.204973 4830 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" event={"ID":"5849038f-38d8-48c8-a4d8-70dc0166cdf9","Type":"ContainerDied","Data":"c24780ade9f15d78f78d1861d632e31d983b213a8e04fc73edb39ca91e6b403a"} Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.253641 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.446629 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.446826 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-log" containerID="cri-o://539360b564f81f0d337c6d6f03c266ad668f86480266c5cc6f96ab141e110887" gracePeriod=30 Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.447228 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-api" containerID="cri-o://6b8eb712f98e7f7c153f5ee3fe37d54c8457ea4ddff207f832dde3bb84ecaa9e" gracePeriod=30 Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.454101 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": EOF" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.455324 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": EOF" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.468782 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.527954 
4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.621358 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-sb\") pod \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.621435 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-svc\") pod \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.621464 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-swift-storage-0\") pod \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.621488 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-nb\") pod \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.621675 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4d5b\" (UniqueName: \"kubernetes.io/projected/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-kube-api-access-h4d5b\") pod \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.621713 4830 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-config\") pod \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\" (UID: \"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7\") " Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.636236 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-kube-api-access-h4d5b" (OuterVolumeSpecName: "kube-api-access-h4d5b") pod "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" (UID: "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7"). InnerVolumeSpecName "kube-api-access-h4d5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.675404 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-config" (OuterVolumeSpecName: "config") pod "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" (UID: "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.685754 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" (UID: "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.686289 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" (UID: "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.689511 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" (UID: "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.692090 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" (UID: "f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.723992 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.724064 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.724075 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.724083 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-ovsdbserver-nb\") on node 
\"crc\" DevicePath \"\"" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.724093 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4d5b\" (UniqueName: \"kubernetes.io/projected/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-kube-api-access-h4d5b\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.724103 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:06 crc kubenswrapper[4830]: I0311 09:37:06.752064 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.189732 4830 scope.go:117] "RemoveContainer" containerID="1082e88df298d5491c45b4e8c99372bf2065e69c892cb2ae2af60456ef07cb32" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.217085 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" event={"ID":"f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7","Type":"ContainerDied","Data":"7586a33145077135f1215537f24c81e3374ebc0285a7e180064da195b29389d5"} Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.217122 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-2xcmq" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.217150 4830 scope.go:117] "RemoveContainer" containerID="f45b5be0bef291446fd5d3bd7d090b740138b75f83e037bab72b907404b42ef6" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.224714 4830 generic.go:334] "Generic (PLEG): container finished" podID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerID="539360b564f81f0d337c6d6f03c266ad668f86480266c5cc6f96ab141e110887" exitCode=143 Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.224845 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9c21203-bcf1-474e-abf5-18f188e0553a","Type":"ContainerDied","Data":"539360b564f81f0d337c6d6f03c266ad668f86480266c5cc6f96ab141e110887"} Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.224915 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerName="nova-metadata-log" containerID="cri-o://738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a" gracePeriod=30 Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.225069 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerName="nova-metadata-metadata" containerID="cri-o://e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea" gracePeriod=30 Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.333433 4830 scope.go:117] "RemoveContainer" containerID="de8f478e3d286ad249d1f4c3460d16d8d1a5e2a07f31ff7c5562f44099f0a087" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.336937 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-2xcmq"] Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.373511 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-6bb4fc677f-2xcmq"] Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.566903 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.678761 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpwbf\" (UniqueName: \"kubernetes.io/projected/5849038f-38d8-48c8-a4d8-70dc0166cdf9-kube-api-access-qpwbf\") pod \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.678847 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-scripts\") pod \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.679041 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-config-data\") pod \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.679085 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-combined-ca-bundle\") pod \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\" (UID: \"5849038f-38d8-48c8-a4d8-70dc0166cdf9\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.689255 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5849038f-38d8-48c8-a4d8-70dc0166cdf9-kube-api-access-qpwbf" (OuterVolumeSpecName: "kube-api-access-qpwbf") pod "5849038f-38d8-48c8-a4d8-70dc0166cdf9" (UID: 
"5849038f-38d8-48c8-a4d8-70dc0166cdf9"). InnerVolumeSpecName "kube-api-access-qpwbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.691547 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-scripts" (OuterVolumeSpecName: "scripts") pod "5849038f-38d8-48c8-a4d8-70dc0166cdf9" (UID: "5849038f-38d8-48c8-a4d8-70dc0166cdf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.747905 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5849038f-38d8-48c8-a4d8-70dc0166cdf9" (UID: "5849038f-38d8-48c8-a4d8-70dc0166cdf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.757336 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-config-data" (OuterVolumeSpecName: "config-data") pod "5849038f-38d8-48c8-a4d8-70dc0166cdf9" (UID: "5849038f-38d8-48c8-a4d8-70dc0166cdf9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.781767 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpwbf\" (UniqueName: \"kubernetes.io/projected/5849038f-38d8-48c8-a4d8-70dc0166cdf9-kube-api-access-qpwbf\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.781811 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.781824 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.781837 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5849038f-38d8-48c8-a4d8-70dc0166cdf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.855730 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.986421 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-combined-ca-bundle\") pod \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.986536 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-nova-metadata-tls-certs\") pod \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.986584 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-config-data\") pod \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.986655 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzskn\" (UniqueName: \"kubernetes.io/projected/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-kube-api-access-jzskn\") pod \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.986731 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-logs\") pod \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\" (UID: \"471cb0dc-46e4-4ac5-b55a-dcb053cacb31\") " Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.988405 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-logs" (OuterVolumeSpecName: "logs") pod "471cb0dc-46e4-4ac5-b55a-dcb053cacb31" (UID: "471cb0dc-46e4-4ac5-b55a-dcb053cacb31"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:37:07 crc kubenswrapper[4830]: I0311 09:37:07.999221 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-kube-api-access-jzskn" (OuterVolumeSpecName: "kube-api-access-jzskn") pod "471cb0dc-46e4-4ac5-b55a-dcb053cacb31" (UID: "471cb0dc-46e4-4ac5-b55a-dcb053cacb31"). InnerVolumeSpecName "kube-api-access-jzskn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.020319 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-config-data" (OuterVolumeSpecName: "config-data") pod "471cb0dc-46e4-4ac5-b55a-dcb053cacb31" (UID: "471cb0dc-46e4-4ac5-b55a-dcb053cacb31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.022149 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "471cb0dc-46e4-4ac5-b55a-dcb053cacb31" (UID: "471cb0dc-46e4-4ac5-b55a-dcb053cacb31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.044446 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "471cb0dc-46e4-4ac5-b55a-dcb053cacb31" (UID: "471cb0dc-46e4-4ac5-b55a-dcb053cacb31"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.088793 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.088831 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.088843 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.088851 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzskn\" (UniqueName: \"kubernetes.io/projected/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-kube-api-access-jzskn\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.088860 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/471cb0dc-46e4-4ac5-b55a-dcb053cacb31-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.235151 4830 generic.go:334] "Generic (PLEG): container finished" podID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerID="e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea" exitCode=0 Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.235181 4830 generic.go:334] "Generic (PLEG): container finished" podID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerID="738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a" exitCode=143 Mar 11 09:37:08 crc kubenswrapper[4830]: 
I0311 09:37:08.235208 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.235235 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"471cb0dc-46e4-4ac5-b55a-dcb053cacb31","Type":"ContainerDied","Data":"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea"} Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.235277 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"471cb0dc-46e4-4ac5-b55a-dcb053cacb31","Type":"ContainerDied","Data":"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a"} Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.235288 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"471cb0dc-46e4-4ac5-b55a-dcb053cacb31","Type":"ContainerDied","Data":"fbaa6dbc96a9a144c1afcc0ec633ea9f66f4eb9575c397d999fcabc6aa03bb63"} Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.235306 4830 scope.go:117] "RemoveContainer" containerID="e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.242041 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c70fe727-3f0d-4884-8ec6-c841f019b453" containerName="nova-scheduler-scheduler" containerID="cri-o://d5d4afeb4990f835d0cd34977a6e9e780dee1fa9d33a0df30ac56c2de3ec16ba" gracePeriod=30 Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.242388 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.242590 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d9ntp" event={"ID":"5849038f-38d8-48c8-a4d8-70dc0166cdf9","Type":"ContainerDied","Data":"9e88d9f98f070feac97e35a343325bd5df8a687ff7f6d2c0f0a9e65364bb4955"} Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.242631 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e88d9f98f070feac97e35a343325bd5df8a687ff7f6d2c0f0a9e65364bb4955" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.278245 4830 scope.go:117] "RemoveContainer" containerID="738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.292029 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.312138 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.327053 4830 scope.go:117] "RemoveContainer" containerID="e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea" Mar 11 09:37:08 crc kubenswrapper[4830]: E0311 09:37:08.327571 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea\": container with ID starting with e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea not found: ID does not exist" containerID="e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.327637 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea"} err="failed to get container 
status \"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea\": rpc error: code = NotFound desc = could not find container \"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea\": container with ID starting with e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea not found: ID does not exist" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.327672 4830 scope.go:117] "RemoveContainer" containerID="738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a" Mar 11 09:37:08 crc kubenswrapper[4830]: E0311 09:37:08.327960 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a\": container with ID starting with 738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a not found: ID does not exist" containerID="738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.327989 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a"} err="failed to get container status \"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a\": rpc error: code = NotFound desc = could not find container \"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a\": container with ID starting with 738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a not found: ID does not exist" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.328007 4830 scope.go:117] "RemoveContainer" containerID="e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.329986 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea"} err="failed to get 
container status \"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea\": rpc error: code = NotFound desc = could not find container \"e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea\": container with ID starting with e3895433cbfe3902ce4e12d325b4af9ccf2e27fa5c4111b40129fb774df5ceea not found: ID does not exist" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.330261 4830 scope.go:117] "RemoveContainer" containerID="738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.330561 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a"} err="failed to get container status \"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a\": rpc error: code = NotFound desc = could not find container \"738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a\": container with ID starting with 738225f61794d1898f11612d8bdfb82224fd9ac10e6ddaba76be0d3027e2ca2a not found: ID does not exist" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342088 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:08 crc kubenswrapper[4830]: E0311 09:37:08.342557 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerName="nova-metadata-metadata" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342578 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerName="nova-metadata-metadata" Mar 11 09:37:08 crc kubenswrapper[4830]: E0311 09:37:08.342606 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a498e09-d7b4-4af2-bb1a-dff67b8ce005" containerName="nova-manage" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342614 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a498e09-d7b4-4af2-bb1a-dff67b8ce005" containerName="nova-manage" Mar 11 09:37:08 crc kubenswrapper[4830]: E0311 09:37:08.342630 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerName="nova-metadata-log" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342637 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerName="nova-metadata-log" Mar 11 09:37:08 crc kubenswrapper[4830]: E0311 09:37:08.342649 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5849038f-38d8-48c8-a4d8-70dc0166cdf9" containerName="nova-cell1-conductor-db-sync" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342657 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5849038f-38d8-48c8-a4d8-70dc0166cdf9" containerName="nova-cell1-conductor-db-sync" Mar 11 09:37:08 crc kubenswrapper[4830]: E0311 09:37:08.342666 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" containerName="init" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342673 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" containerName="init" Mar 11 09:37:08 crc kubenswrapper[4830]: E0311 09:37:08.342699 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" containerName="dnsmasq-dns" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342707 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" containerName="dnsmasq-dns" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342924 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a498e09-d7b4-4af2-bb1a-dff67b8ce005" containerName="nova-manage" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342940 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerName="nova-metadata-log" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342958 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" containerName="nova-metadata-metadata" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342970 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" containerName="dnsmasq-dns" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.342985 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5849038f-38d8-48c8-a4d8-70dc0166cdf9" containerName="nova-cell1-conductor-db-sync" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.344206 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.353099 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.354688 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.364394 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.369502 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.369838 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.370328 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.395914 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.495776 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.495833 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bc7c45-10c9-4571-923a-4fb4b861657e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.495865 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rtvt\" (UniqueName: \"kubernetes.io/projected/e6bc7c45-10c9-4571-923a-4fb4b861657e-kube-api-access-9rtvt\") pod 
\"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.495887 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bc7c45-10c9-4571-923a-4fb4b861657e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.495903 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-config-data\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.495929 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e312c8-3719-4bea-8034-0958a37b831f-logs\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.495960 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.495976 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p76jc\" (UniqueName: \"kubernetes.io/projected/81e312c8-3719-4bea-8034-0958a37b831f-kube-api-access-p76jc\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " 
pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.597268 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.597354 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bc7c45-10c9-4571-923a-4fb4b861657e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.597386 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rtvt\" (UniqueName: \"kubernetes.io/projected/e6bc7c45-10c9-4571-923a-4fb4b861657e-kube-api-access-9rtvt\") pod \"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.597404 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bc7c45-10c9-4571-923a-4fb4b861657e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.597420 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-config-data\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.597456 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e312c8-3719-4bea-8034-0958a37b831f-logs\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.597495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.597512 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p76jc\" (UniqueName: \"kubernetes.io/projected/81e312c8-3719-4bea-8034-0958a37b831f-kube-api-access-p76jc\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.598916 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e312c8-3719-4bea-8034-0958a37b831f-logs\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.603609 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.606951 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.610354 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6bc7c45-10c9-4571-923a-4fb4b861657e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.611161 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bc7c45-10c9-4571-923a-4fb4b861657e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.614392 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-config-data\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.617801 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p76jc\" (UniqueName: \"kubernetes.io/projected/81e312c8-3719-4bea-8034-0958a37b831f-kube-api-access-p76jc\") pod \"nova-metadata-0\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") " pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.625385 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rtvt\" (UniqueName: \"kubernetes.io/projected/e6bc7c45-10c9-4571-923a-4fb4b861657e-kube-api-access-9rtvt\") pod \"nova-cell1-conductor-0\" (UID: \"e6bc7c45-10c9-4571-923a-4fb4b861657e\") " pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.810572 4830 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.822876 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.958104 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="471cb0dc-46e4-4ac5-b55a-dcb053cacb31" path="/var/lib/kubelet/pods/471cb0dc-46e4-4ac5-b55a-dcb053cacb31/volumes" Mar 11 09:37:08 crc kubenswrapper[4830]: I0311 09:37:08.958872 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7" path="/var/lib/kubelet/pods/f7bd13ed-7aaf-420d-8ac8-90ac86e21cc7/volumes" Mar 11 09:37:09 crc kubenswrapper[4830]: I0311 09:37:09.286669 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 11 09:37:09 crc kubenswrapper[4830]: W0311 09:37:09.293883 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6bc7c45_10c9_4571_923a_4fb4b861657e.slice/crio-ace6d8dbc5563a44e624a6b7a2b75fc6769f9607a550c84ad63b08b50e8855c9 WatchSource:0}: Error finding container ace6d8dbc5563a44e624a6b7a2b75fc6769f9607a550c84ad63b08b50e8855c9: Status 404 returned error can't find the container with id ace6d8dbc5563a44e624a6b7a2b75fc6769f9607a550c84ad63b08b50e8855c9 Mar 11 09:37:09 crc kubenswrapper[4830]: I0311 09:37:09.351147 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:09 crc kubenswrapper[4830]: W0311 09:37:09.352265 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e312c8_3719_4bea_8034_0958a37b831f.slice/crio-cefb9660fb012010f3cff2b2be52e164191cee11766b895f06cb602173143e96 WatchSource:0}: Error finding container 
cefb9660fb012010f3cff2b2be52e164191cee11766b895f06cb602173143e96: Status 404 returned error can't find the container with id cefb9660fb012010f3cff2b2be52e164191cee11766b895f06cb602173143e96 Mar 11 09:37:09 crc kubenswrapper[4830]: I0311 09:37:09.623896 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 09:37:09 crc kubenswrapper[4830]: I0311 09:37:09.624629 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b5731a37-0030-4920-b5c2-ded8262d8e2a" containerName="kube-state-metrics" containerID="cri-o://5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08" gracePeriod=30 Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.132231 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.226531 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qf6b\" (UniqueName: \"kubernetes.io/projected/b5731a37-0030-4920-b5c2-ded8262d8e2a-kube-api-access-8qf6b\") pod \"b5731a37-0030-4920-b5c2-ded8262d8e2a\" (UID: \"b5731a37-0030-4920-b5c2-ded8262d8e2a\") " Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.231601 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5731a37-0030-4920-b5c2-ded8262d8e2a-kube-api-access-8qf6b" (OuterVolumeSpecName: "kube-api-access-8qf6b") pod "b5731a37-0030-4920-b5c2-ded8262d8e2a" (UID: "b5731a37-0030-4920-b5c2-ded8262d8e2a"). InnerVolumeSpecName "kube-api-access-8qf6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.260895 4830 generic.go:334] "Generic (PLEG): container finished" podID="b5731a37-0030-4920-b5c2-ded8262d8e2a" containerID="5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08" exitCode=2 Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.261168 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b5731a37-0030-4920-b5c2-ded8262d8e2a","Type":"ContainerDied","Data":"5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08"} Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.261254 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b5731a37-0030-4920-b5c2-ded8262d8e2a","Type":"ContainerDied","Data":"3518723c18be813832176b3346f3ce971ac34c6747eb2eda4df787b0e2df25c1"} Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.261321 4830 scope.go:117] "RemoveContainer" containerID="5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.261497 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.265586 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e6bc7c45-10c9-4571-923a-4fb4b861657e","Type":"ContainerStarted","Data":"f57e7c4e7221a672211a2e9723d4b662ac064a928cfe3dd14abcc4ee93c0adc1"} Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.265634 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e6bc7c45-10c9-4571-923a-4fb4b861657e","Type":"ContainerStarted","Data":"ace6d8dbc5563a44e624a6b7a2b75fc6769f9607a550c84ad63b08b50e8855c9"} Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.265748 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.270807 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81e312c8-3719-4bea-8034-0958a37b831f","Type":"ContainerStarted","Data":"055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741"} Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.270988 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81e312c8-3719-4bea-8034-0958a37b831f","Type":"ContainerStarted","Data":"fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea"} Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.271070 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81e312c8-3719-4bea-8034-0958a37b831f","Type":"ContainerStarted","Data":"cefb9660fb012010f3cff2b2be52e164191cee11766b895f06cb602173143e96"} Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.318622 4830 scope.go:117] "RemoveContainer" containerID="5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08" Mar 11 09:37:10 crc kubenswrapper[4830]: E0311 09:37:10.319823 
4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08\": container with ID starting with 5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08 not found: ID does not exist" containerID="5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.319882 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08"} err="failed to get container status \"5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08\": rpc error: code = NotFound desc = could not find container \"5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08\": container with ID starting with 5723bdee7aef14c3126b90b2454a67933e9f065a3fee9318a0aa85ac0a61eb08 not found: ID does not exist" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.323384 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.323347369 podStartE2EDuration="2.323347369s" podCreationTimestamp="2026-03-11 09:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:10.285571552 +0000 UTC m=+1398.066722251" watchObservedRunningTime="2026-03-11 09:37:10.323347369 +0000 UTC m=+1398.104498058" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.328671 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.328652226 podStartE2EDuration="2.328652226s" podCreationTimestamp="2026-03-11 09:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 
09:37:10.307064057 +0000 UTC m=+1398.088214766" watchObservedRunningTime="2026-03-11 09:37:10.328652226 +0000 UTC m=+1398.109802905" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.328894 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qf6b\" (UniqueName: \"kubernetes.io/projected/b5731a37-0030-4920-b5c2-ded8262d8e2a-kube-api-access-8qf6b\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.346494 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.358066 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.365841 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 09:37:10 crc kubenswrapper[4830]: E0311 09:37:10.366406 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5731a37-0030-4920-b5c2-ded8262d8e2a" containerName="kube-state-metrics" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.366427 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5731a37-0030-4920-b5c2-ded8262d8e2a" containerName="kube-state-metrics" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.366613 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5731a37-0030-4920-b5c2-ded8262d8e2a" containerName="kube-state-metrics" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.367307 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.369525 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.370792 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.384252 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.431715 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.431791 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzdhb\" (UniqueName: \"kubernetes.io/projected/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-api-access-qzdhb\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.431888 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.431915 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.533395 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.533440 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.533531 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.533608 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzdhb\" (UniqueName: \"kubernetes.io/projected/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-api-access-qzdhb\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.539623 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.539882 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.540693 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85b49bc-4607-4852-9fce-dcf43af1069f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.564630 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzdhb\" (UniqueName: \"kubernetes.io/projected/e85b49bc-4607-4852-9fce-dcf43af1069f-kube-api-access-qzdhb\") pod \"kube-state-metrics-0\" (UID: \"e85b49bc-4607-4852-9fce-dcf43af1069f\") " pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.687947 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 11 09:37:10 crc kubenswrapper[4830]: E0311 09:37:10.848198 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5d4afeb4990f835d0cd34977a6e9e780dee1fa9d33a0df30ac56c2de3ec16ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 09:37:10 crc kubenswrapper[4830]: E0311 09:37:10.850272 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5d4afeb4990f835d0cd34977a6e9e780dee1fa9d33a0df30ac56c2de3ec16ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 09:37:10 crc kubenswrapper[4830]: E0311 09:37:10.851872 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5d4afeb4990f835d0cd34977a6e9e780dee1fa9d33a0df30ac56c2de3ec16ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 11 09:37:10 crc kubenswrapper[4830]: E0311 09:37:10.851984 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c70fe727-3f0d-4884-8ec6-c841f019b453" containerName="nova-scheduler-scheduler" Mar 11 09:37:10 crc kubenswrapper[4830]: I0311 09:37:10.956462 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5731a37-0030-4920-b5c2-ded8262d8e2a" path="/var/lib/kubelet/pods/b5731a37-0030-4920-b5c2-ded8262d8e2a/volumes" Mar 11 09:37:11 crc kubenswrapper[4830]: I0311 09:37:11.144066 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/kube-state-metrics-0"] Mar 11 09:37:11 crc kubenswrapper[4830]: W0311 09:37:11.155436 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85b49bc_4607_4852_9fce_dcf43af1069f.slice/crio-c3a5cc536e44edcb911f904d23e67c30529b987612a6504fa2ece34526f3dde2 WatchSource:0}: Error finding container c3a5cc536e44edcb911f904d23e67c30529b987612a6504fa2ece34526f3dde2: Status 404 returned error can't find the container with id c3a5cc536e44edcb911f904d23e67c30529b987612a6504fa2ece34526f3dde2 Mar 11 09:37:11 crc kubenswrapper[4830]: I0311 09:37:11.280981 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e85b49bc-4607-4852-9fce-dcf43af1069f","Type":"ContainerStarted","Data":"c3a5cc536e44edcb911f904d23e67c30529b987612a6504fa2ece34526f3dde2"} Mar 11 09:37:11 crc kubenswrapper[4830]: I0311 09:37:11.519095 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:11 crc kubenswrapper[4830]: I0311 09:37:11.519407 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="ceilometer-central-agent" containerID="cri-o://9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f" gracePeriod=30 Mar 11 09:37:11 crc kubenswrapper[4830]: I0311 09:37:11.519719 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="proxy-httpd" containerID="cri-o://17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182" gracePeriod=30 Mar 11 09:37:11 crc kubenswrapper[4830]: I0311 09:37:11.519849 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="sg-core" 
containerID="cri-o://0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0" gracePeriod=30 Mar 11 09:37:11 crc kubenswrapper[4830]: I0311 09:37:11.519907 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="ceilometer-notification-agent" containerID="cri-o://a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe" gracePeriod=30 Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.301861 4830 generic.go:334] "Generic (PLEG): container finished" podID="c70fe727-3f0d-4884-8ec6-c841f019b453" containerID="d5d4afeb4990f835d0cd34977a6e9e780dee1fa9d33a0df30ac56c2de3ec16ba" exitCode=0 Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.302193 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c70fe727-3f0d-4884-8ec6-c841f019b453","Type":"ContainerDied","Data":"d5d4afeb4990f835d0cd34977a6e9e780dee1fa9d33a0df30ac56c2de3ec16ba"} Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.303321 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e85b49bc-4607-4852-9fce-dcf43af1069f","Type":"ContainerStarted","Data":"2032b3375a5e3bda5da76c787051b327a8e80f5eb91f5768ae605512467e2840"} Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.304801 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.307465 4830 generic.go:334] "Generic (PLEG): container finished" podID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerID="17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182" exitCode=0 Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.307494 4830 generic.go:334] "Generic (PLEG): container finished" podID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerID="0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0" 
exitCode=2 Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.307503 4830 generic.go:334] "Generic (PLEG): container finished" podID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerID="9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f" exitCode=0 Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.307522 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerDied","Data":"17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182"} Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.307542 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerDied","Data":"0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0"} Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.307555 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerDied","Data":"9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f"} Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.331531 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.837603719 podStartE2EDuration="2.331512464s" podCreationTimestamp="2026-03-11 09:37:10 +0000 UTC" firstStartedPulling="2026-03-11 09:37:11.158633235 +0000 UTC m=+1398.939783924" lastFinishedPulling="2026-03-11 09:37:11.65254198 +0000 UTC m=+1399.433692669" observedRunningTime="2026-03-11 09:37:12.320370475 +0000 UTC m=+1400.101521164" watchObservedRunningTime="2026-03-11 09:37:12.331512464 +0000 UTC m=+1400.112663153" Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.424354 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.569111 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mw7t\" (UniqueName: \"kubernetes.io/projected/c70fe727-3f0d-4884-8ec6-c841f019b453-kube-api-access-9mw7t\") pod \"c70fe727-3f0d-4884-8ec6-c841f019b453\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.569220 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-combined-ca-bundle\") pod \"c70fe727-3f0d-4884-8ec6-c841f019b453\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.569399 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-config-data\") pod \"c70fe727-3f0d-4884-8ec6-c841f019b453\" (UID: \"c70fe727-3f0d-4884-8ec6-c841f019b453\") " Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.574461 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70fe727-3f0d-4884-8ec6-c841f019b453-kube-api-access-9mw7t" (OuterVolumeSpecName: "kube-api-access-9mw7t") pod "c70fe727-3f0d-4884-8ec6-c841f019b453" (UID: "c70fe727-3f0d-4884-8ec6-c841f019b453"). InnerVolumeSpecName "kube-api-access-9mw7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.594583 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c70fe727-3f0d-4884-8ec6-c841f019b453" (UID: "c70fe727-3f0d-4884-8ec6-c841f019b453"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.607538 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-config-data" (OuterVolumeSpecName: "config-data") pod "c70fe727-3f0d-4884-8ec6-c841f019b453" (UID: "c70fe727-3f0d-4884-8ec6-c841f019b453"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.671487 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.671521 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mw7t\" (UniqueName: \"kubernetes.io/projected/c70fe727-3f0d-4884-8ec6-c841f019b453-kube-api-access-9mw7t\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:12 crc kubenswrapper[4830]: I0311 09:37:12.671531 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70fe727-3f0d-4884-8ec6-c841f019b453-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.060795 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.060897 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.060984 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.062447 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d51cf8acf1c408e7829c31a89fc6bf74196f438e8b371de9aaedaab30e9cfc5"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.062561 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://6d51cf8acf1c408e7829c31a89fc6bf74196f438e8b371de9aaedaab30e9cfc5" gracePeriod=600 Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.317527 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c70fe727-3f0d-4884-8ec6-c841f019b453","Type":"ContainerDied","Data":"bd1a29b6d5c2ef1aa533f8bdc362d399056e5d09574c82eb308cc4d26497b300"} Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.317814 4830 scope.go:117] "RemoveContainer" containerID="d5d4afeb4990f835d0cd34977a6e9e780dee1fa9d33a0df30ac56c2de3ec16ba" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.317774 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.326523 4830 generic.go:334] "Generic (PLEG): container finished" podID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerID="6b8eb712f98e7f7c153f5ee3fe37d54c8457ea4ddff207f832dde3bb84ecaa9e" exitCode=0 Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.326588 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9c21203-bcf1-474e-abf5-18f188e0553a","Type":"ContainerDied","Data":"6b8eb712f98e7f7c153f5ee3fe37d54c8457ea4ddff207f832dde3bb84ecaa9e"} Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.326614 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d9c21203-bcf1-474e-abf5-18f188e0553a","Type":"ContainerDied","Data":"5a6bea82a5d8cfa88878064fbd299b17cd86912774c298b17a6f8cbe16a91bff"} Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.326623 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6bea82a5d8cfa88878064fbd299b17cd86912774c298b17a6f8cbe16a91bff" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.343742 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="6d51cf8acf1c408e7829c31a89fc6bf74196f438e8b371de9aaedaab30e9cfc5" exitCode=0 Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.344460 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"6d51cf8acf1c408e7829c31a89fc6bf74196f438e8b371de9aaedaab30e9cfc5"} Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.382466 4830 scope.go:117] "RemoveContainer" containerID="e06cb787da7fb0f6798e1465b6764e43fa4f19a8709a93ec236e4a0b85a72f7c" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.399276 4830 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.419420 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.431001 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.439696 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:13 crc kubenswrapper[4830]: E0311 09:37:13.440171 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-api" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.440188 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-api" Mar 11 09:37:13 crc kubenswrapper[4830]: E0311 09:37:13.440213 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-log" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.440220 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-log" Mar 11 09:37:13 crc kubenswrapper[4830]: E0311 09:37:13.440232 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70fe727-3f0d-4884-8ec6-c841f019b453" containerName="nova-scheduler-scheduler" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.440240 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70fe727-3f0d-4884-8ec6-c841f019b453" containerName="nova-scheduler-scheduler" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.440412 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-log" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.440442 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" containerName="nova-api-api" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.440453 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70fe727-3f0d-4884-8ec6-c841f019b453" containerName="nova-scheduler-scheduler" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.441089 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.444194 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.457374 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.492934 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-config-data\") pod \"d9c21203-bcf1-474e-abf5-18f188e0553a\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.493048 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c21203-bcf1-474e-abf5-18f188e0553a-logs\") pod \"d9c21203-bcf1-474e-abf5-18f188e0553a\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.493121 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndhq4\" (UniqueName: \"kubernetes.io/projected/d9c21203-bcf1-474e-abf5-18f188e0553a-kube-api-access-ndhq4\") pod \"d9c21203-bcf1-474e-abf5-18f188e0553a\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.493144 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-combined-ca-bundle\") pod \"d9c21203-bcf1-474e-abf5-18f188e0553a\" (UID: \"d9c21203-bcf1-474e-abf5-18f188e0553a\") " Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.493591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fcsg\" (UniqueName: \"kubernetes.io/projected/173258f5-abf0-4ddb-ba63-02eb03db5521-kube-api-access-6fcsg\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.493711 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-config-data\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.494291 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.495660 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c21203-bcf1-474e-abf5-18f188e0553a-logs" (OuterVolumeSpecName: "logs") pod "d9c21203-bcf1-474e-abf5-18f188e0553a" (UID: "d9c21203-bcf1-474e-abf5-18f188e0553a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.500922 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c21203-bcf1-474e-abf5-18f188e0553a-kube-api-access-ndhq4" (OuterVolumeSpecName: "kube-api-access-ndhq4") pod "d9c21203-bcf1-474e-abf5-18f188e0553a" (UID: "d9c21203-bcf1-474e-abf5-18f188e0553a"). InnerVolumeSpecName "kube-api-access-ndhq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.523443 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9c21203-bcf1-474e-abf5-18f188e0553a" (UID: "d9c21203-bcf1-474e-abf5-18f188e0553a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.525611 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-config-data" (OuterVolumeSpecName: "config-data") pod "d9c21203-bcf1-474e-abf5-18f188e0553a" (UID: "d9c21203-bcf1-474e-abf5-18f188e0553a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.595545 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fcsg\" (UniqueName: \"kubernetes.io/projected/173258f5-abf0-4ddb-ba63-02eb03db5521-kube-api-access-6fcsg\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.595614 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-config-data\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.595680 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.595727 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.595738 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c21203-bcf1-474e-abf5-18f188e0553a-logs\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.595747 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndhq4\" (UniqueName: \"kubernetes.io/projected/d9c21203-bcf1-474e-abf5-18f188e0553a-kube-api-access-ndhq4\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 
09:37:13.595756 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c21203-bcf1-474e-abf5-18f188e0553a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.599962 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-config-data\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.600090 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.615644 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fcsg\" (UniqueName: \"kubernetes.io/projected/173258f5-abf0-4ddb-ba63-02eb03db5521-kube-api-access-6fcsg\") pod \"nova-scheduler-0\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.768570 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.812528 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 09:37:13 crc kubenswrapper[4830]: I0311 09:37:13.813327 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.222599 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.356754 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"} Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.360378 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.360401 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"173258f5-abf0-4ddb-ba63-02eb03db5521","Type":"ContainerStarted","Data":"a4a9ebaeedc96798e9f967c2add25ab9c0afc4483124b5728c063491305234e0"} Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.407513 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.426803 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.446946 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.449012 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.454934 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.471064 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.516009 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.516081 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085b7222-f42d-47e7-bcdb-f3e41e3333b4-logs\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.516126 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847pt\" (UniqueName: \"kubernetes.io/projected/085b7222-f42d-47e7-bcdb-f3e41e3333b4-kube-api-access-847pt\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.516215 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-config-data\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.617889 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.617939 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085b7222-f42d-47e7-bcdb-f3e41e3333b4-logs\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.617967 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-847pt\" (UniqueName: \"kubernetes.io/projected/085b7222-f42d-47e7-bcdb-f3e41e3333b4-kube-api-access-847pt\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.618010 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-config-data\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.619168 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085b7222-f42d-47e7-bcdb-f3e41e3333b4-logs\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.624620 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-config-data\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.624656 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.636442 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-847pt\" (UniqueName: \"kubernetes.io/projected/085b7222-f42d-47e7-bcdb-f3e41e3333b4-kube-api-access-847pt\") pod \"nova-api-0\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") " pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.767181 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.943663 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70fe727-3f0d-4884-8ec6-c841f019b453" path="/var/lib/kubelet/pods/c70fe727-3f0d-4884-8ec6-c841f019b453/volumes" Mar 11 09:37:14 crc kubenswrapper[4830]: I0311 09:37:14.944514 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c21203-bcf1-474e-abf5-18f188e0553a" path="/var/lib/kubelet/pods/d9c21203-bcf1-474e-abf5-18f188e0553a/volumes" Mar 11 09:37:15 crc kubenswrapper[4830]: I0311 09:37:15.257235 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:37:15 crc kubenswrapper[4830]: I0311 09:37:15.374219 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"085b7222-f42d-47e7-bcdb-f3e41e3333b4","Type":"ContainerStarted","Data":"787727fafc980463e56bbaaff4f3c244b9f3f27e046c6112b139fda6de5ef241"} Mar 11 09:37:15 crc kubenswrapper[4830]: I0311 09:37:15.377526 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"173258f5-abf0-4ddb-ba63-02eb03db5521","Type":"ContainerStarted","Data":"8764e41a667c05f357d011d34a1e5c0fc8fd12d4f29e1fd5e3eca5a9ae142271"} Mar 11 09:37:16 crc kubenswrapper[4830]: I0311 09:37:16.387960 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"085b7222-f42d-47e7-bcdb-f3e41e3333b4","Type":"ContainerStarted","Data":"a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724"} Mar 11 09:37:16 crc kubenswrapper[4830]: I0311 09:37:16.388504 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"085b7222-f42d-47e7-bcdb-f3e41e3333b4","Type":"ContainerStarted","Data":"a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14"} Mar 11 09:37:16 crc kubenswrapper[4830]: I0311 09:37:16.415861 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.415838498 podStartE2EDuration="2.415838498s" podCreationTimestamp="2026-03-11 09:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:16.412530377 +0000 UTC m=+1404.193681086" watchObservedRunningTime="2026-03-11 09:37:16.415838498 +0000 UTC m=+1404.196989197" Mar 11 09:37:16 crc kubenswrapper[4830]: I0311 09:37:16.417169 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.417162365 podStartE2EDuration="3.417162365s" podCreationTimestamp="2026-03-11 09:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:15.396558435 +0000 UTC m=+1403.177709134" watchObservedRunningTime="2026-03-11 09:37:16.417162365 +0000 UTC m=+1404.198313054" Mar 11 09:37:18 crc kubenswrapper[4830]: I0311 09:37:18.769052 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Mar 11 09:37:18 crc kubenswrapper[4830]: I0311 09:37:18.811421 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 09:37:18 crc kubenswrapper[4830]: I0311 09:37:18.811500 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 09:37:18 crc kubenswrapper[4830]: I0311 09:37:18.859865 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 11 09:37:19 crc kubenswrapper[4830]: I0311 09:37:19.825271 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 09:37:19 crc kubenswrapper[4830]: I0311 09:37:19.825301 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:37:20 crc kubenswrapper[4830]: I0311 09:37:20.788069 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.044453 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.190300 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-sg-core-conf-yaml\") pod \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.190346 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-run-httpd\") pod \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.190380 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-log-httpd\") pod \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.190448 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-combined-ca-bundle\") pod \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.190497 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-scripts\") pod \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.190537 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-config-data\") pod \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.190574 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d9t7\" (UniqueName: \"kubernetes.io/projected/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-kube-api-access-5d9t7\") pod \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\" (UID: \"f71af19d-ca84-46f9-94d1-7ff4a6b1f861\") " Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.196995 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f71af19d-ca84-46f9-94d1-7ff4a6b1f861" (UID: "f71af19d-ca84-46f9-94d1-7ff4a6b1f861"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.197919 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f71af19d-ca84-46f9-94d1-7ff4a6b1f861" (UID: "f71af19d-ca84-46f9-94d1-7ff4a6b1f861"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.215667 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-scripts" (OuterVolumeSpecName: "scripts") pod "f71af19d-ca84-46f9-94d1-7ff4a6b1f861" (UID: "f71af19d-ca84-46f9-94d1-7ff4a6b1f861"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.215709 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-kube-api-access-5d9t7" (OuterVolumeSpecName: "kube-api-access-5d9t7") pod "f71af19d-ca84-46f9-94d1-7ff4a6b1f861" (UID: "f71af19d-ca84-46f9-94d1-7ff4a6b1f861"). InnerVolumeSpecName "kube-api-access-5d9t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.237125 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f71af19d-ca84-46f9-94d1-7ff4a6b1f861" (UID: "f71af19d-ca84-46f9-94d1-7ff4a6b1f861"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.292447 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.292492 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.292508 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.292520 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-scripts\") on node \"crc\" DevicePath \"\"" 
Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.292534 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d9t7\" (UniqueName: \"kubernetes.io/projected/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-kube-api-access-5d9t7\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.331132 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f71af19d-ca84-46f9-94d1-7ff4a6b1f861" (UID: "f71af19d-ca84-46f9-94d1-7ff4a6b1f861"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.346106 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-config-data" (OuterVolumeSpecName: "config-data") pod "f71af19d-ca84-46f9-94d1-7ff4a6b1f861" (UID: "f71af19d-ca84-46f9-94d1-7ff4a6b1f861"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.393919 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.393960 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71af19d-ca84-46f9-94d1-7ff4a6b1f861-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.435040 4830 generic.go:334] "Generic (PLEG): container finished" podID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerID="a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe" exitCode=0 Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.435084 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerDied","Data":"a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe"} Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.435110 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f71af19d-ca84-46f9-94d1-7ff4a6b1f861","Type":"ContainerDied","Data":"738c51d8ee94576774ae5f28a9e0058f2bcc38d1c10c400da4cf079dcd6bd1f1"} Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.435126 4830 scope.go:117] "RemoveContainer" containerID="17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.435155 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.463934 4830 scope.go:117] "RemoveContainer" containerID="0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.483302 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.493483 4830 scope.go:117] "RemoveContainer" containerID="a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.508798 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516082 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:21 crc kubenswrapper[4830]: E0311 09:37:21.516427 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="ceilometer-notification-agent" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516444 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="ceilometer-notification-agent" Mar 11 09:37:21 crc kubenswrapper[4830]: E0311 09:37:21.516458 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="sg-core" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516465 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="sg-core" Mar 11 09:37:21 crc kubenswrapper[4830]: E0311 09:37:21.516473 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="proxy-httpd" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516479 4830 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="proxy-httpd" Mar 11 09:37:21 crc kubenswrapper[4830]: E0311 09:37:21.516500 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="ceilometer-central-agent" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516506 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="ceilometer-central-agent" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516668 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="sg-core" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516684 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="ceilometer-notification-agent" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516698 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="proxy-httpd" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.516710 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" containerName="ceilometer-central-agent" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.518604 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.522001 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.522075 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.522198 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.522554 4830 scope.go:117] "RemoveContainer" containerID="9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.528472 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.570907 4830 scope.go:117] "RemoveContainer" containerID="17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182" Mar 11 09:37:21 crc kubenswrapper[4830]: E0311 09:37:21.571423 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182\": container with ID starting with 17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182 not found: ID does not exist" containerID="17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.571472 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182"} err="failed to get container status \"17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182\": rpc error: code = NotFound desc = could not find container \"17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182\": 
container with ID starting with 17856db48604af2c016361579cd6ce3eb1c90692de0c8f94157854c7ef5ca182 not found: ID does not exist" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.571517 4830 scope.go:117] "RemoveContainer" containerID="0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0" Mar 11 09:37:21 crc kubenswrapper[4830]: E0311 09:37:21.572525 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0\": container with ID starting with 0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0 not found: ID does not exist" containerID="0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.572550 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0"} err="failed to get container status \"0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0\": rpc error: code = NotFound desc = could not find container \"0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0\": container with ID starting with 0106857ff02a6f359327c0a092de7812f56ed3df5ff4f05c5b1a87c7b8b489e0 not found: ID does not exist" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.572564 4830 scope.go:117] "RemoveContainer" containerID="a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe" Mar 11 09:37:21 crc kubenswrapper[4830]: E0311 09:37:21.572798 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe\": container with ID starting with a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe not found: ID does not exist" 
containerID="a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.572822 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe"} err="failed to get container status \"a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe\": rpc error: code = NotFound desc = could not find container \"a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe\": container with ID starting with a29d9665025ec7bcbc520c8f509f45367d05111384c3f4fa8b4d664f719151fe not found: ID does not exist" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.572836 4830 scope.go:117] "RemoveContainer" containerID="9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f" Mar 11 09:37:21 crc kubenswrapper[4830]: E0311 09:37:21.573138 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f\": container with ID starting with 9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f not found: ID does not exist" containerID="9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.573162 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f"} err="failed to get container status \"9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f\": rpc error: code = NotFound desc = could not find container \"9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f\": container with ID starting with 9bd6cbe46735b037c5dc176098ec3b6ee4ad9562ba60443ae1b84e8c2e12506f not found: ID does not exist" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.597086 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.597165 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-run-httpd\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.597185 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45q84\" (UniqueName: \"kubernetes.io/projected/193c55e6-58a4-4b42-9692-cf7d6f555333-kube-api-access-45q84\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.597209 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-log-httpd\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.597249 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-config-data\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.597266 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.597287 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-scripts\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.597322 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.699355 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.699709 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.699772 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-run-httpd\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " 
pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.699794 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45q84\" (UniqueName: \"kubernetes.io/projected/193c55e6-58a4-4b42-9692-cf7d6f555333-kube-api-access-45q84\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.699820 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-log-httpd\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.699844 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-config-data\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.699866 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.699901 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-scripts\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.705360 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-run-httpd\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.706159 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-scripts\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.706285 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-log-httpd\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.724220 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.724251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.729231 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.734915 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45q84\" (UniqueName: \"kubernetes.io/projected/193c55e6-58a4-4b42-9692-cf7d6f555333-kube-api-access-45q84\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.747394 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-config-data\") pod \"ceilometer-0\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " pod="openstack/ceilometer-0" Mar 11 09:37:21 crc kubenswrapper[4830]: I0311 09:37:21.848074 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:22 crc kubenswrapper[4830]: I0311 09:37:22.343155 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:22 crc kubenswrapper[4830]: I0311 09:37:22.445602 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerStarted","Data":"c59eedd392e5b5ba40f489143da86eb2fae70fd67d93380f6a75ba7c5699488f"} Mar 11 09:37:22 crc kubenswrapper[4830]: I0311 09:37:22.949603 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71af19d-ca84-46f9-94d1-7ff4a6b1f861" path="/var/lib/kubelet/pods/f71af19d-ca84-46f9-94d1-7ff4a6b1f861/volumes" Mar 11 09:37:23 crc kubenswrapper[4830]: I0311 09:37:23.456131 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerStarted","Data":"c13cbff24bc806ad2bbc1c368c70d15e4ad5373fed9df8de316f27efbb65abe0"} Mar 11 09:37:23 crc kubenswrapper[4830]: I0311 09:37:23.768941 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 11 09:37:23 crc 
kubenswrapper[4830]: I0311 09:37:23.804690 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 11 09:37:24 crc kubenswrapper[4830]: I0311 09:37:24.467157 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerStarted","Data":"d636c03378ee5d37e67e81eab952ad6f9aa453e460316014a941b5dd62e2778a"} Mar 11 09:37:24 crc kubenswrapper[4830]: I0311 09:37:24.501614 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 11 09:37:24 crc kubenswrapper[4830]: I0311 09:37:24.768376 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 09:37:24 crc kubenswrapper[4830]: I0311 09:37:24.768709 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 09:37:25 crc kubenswrapper[4830]: I0311 09:37:25.478471 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerStarted","Data":"d0c0991b4d050af3eb3294ed45c785bd5352a9d79f14e037c90a9b50b0da4725"} Mar 11 09:37:25 crc kubenswrapper[4830]: I0311 09:37:25.852176 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:37:25 crc kubenswrapper[4830]: I0311 09:37:25.852178 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 11 09:37:28 crc 
kubenswrapper[4830]: I0311 09:37:28.504847 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerStarted","Data":"61faafb2893251f9dc50ac5b42478d9119453a2e90dfb9b2c101de96b68474fe"} Mar 11 09:37:28 crc kubenswrapper[4830]: I0311 09:37:28.505386 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 09:37:28 crc kubenswrapper[4830]: I0311 09:37:28.531232 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.479533179 podStartE2EDuration="7.531213629s" podCreationTimestamp="2026-03-11 09:37:21 +0000 UTC" firstStartedPulling="2026-03-11 09:37:22.351005348 +0000 UTC m=+1410.132156037" lastFinishedPulling="2026-03-11 09:37:27.402685798 +0000 UTC m=+1415.183836487" observedRunningTime="2026-03-11 09:37:28.529462281 +0000 UTC m=+1416.310612990" watchObservedRunningTime="2026-03-11 09:37:28.531213629 +0000 UTC m=+1416.312364318" Mar 11 09:37:28 crc kubenswrapper[4830]: I0311 09:37:28.818126 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 09:37:28 crc kubenswrapper[4830]: I0311 09:37:28.824811 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 09:37:28 crc kubenswrapper[4830]: I0311 09:37:28.825882 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 11 09:37:29 crc kubenswrapper[4830]: I0311 09:37:29.522454 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.533632 4830 generic.go:334] "Generic (PLEG): container finished" podID="52008989-6ec1-41f8-99b6-7daeb3591033" containerID="9f7699a8975d2876363c67476e644a5ef45ba1160b67faaad9196da93584601f" exitCode=137 Mar 11 09:37:31 
crc kubenswrapper[4830]: I0311 09:37:31.533711 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52008989-6ec1-41f8-99b6-7daeb3591033","Type":"ContainerDied","Data":"9f7699a8975d2876363c67476e644a5ef45ba1160b67faaad9196da93584601f"} Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.534180 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52008989-6ec1-41f8-99b6-7daeb3591033","Type":"ContainerDied","Data":"d44fb77c65da46d64bbbc6c15d2e704f6f4d3d58d391e6b127c1c3b91f1ab781"} Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.534193 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d44fb77c65da46d64bbbc6c15d2e704f6f4d3d58d391e6b127c1c3b91f1ab781" Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.543863 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.628905 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-combined-ca-bundle\") pod \"52008989-6ec1-41f8-99b6-7daeb3591033\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.629061 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlmk8\" (UniqueName: \"kubernetes.io/projected/52008989-6ec1-41f8-99b6-7daeb3591033-kube-api-access-hlmk8\") pod \"52008989-6ec1-41f8-99b6-7daeb3591033\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.629098 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-config-data\") pod 
\"52008989-6ec1-41f8-99b6-7daeb3591033\" (UID: \"52008989-6ec1-41f8-99b6-7daeb3591033\") " Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.639648 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52008989-6ec1-41f8-99b6-7daeb3591033-kube-api-access-hlmk8" (OuterVolumeSpecName: "kube-api-access-hlmk8") pod "52008989-6ec1-41f8-99b6-7daeb3591033" (UID: "52008989-6ec1-41f8-99b6-7daeb3591033"). InnerVolumeSpecName "kube-api-access-hlmk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.662265 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-config-data" (OuterVolumeSpecName: "config-data") pod "52008989-6ec1-41f8-99b6-7daeb3591033" (UID: "52008989-6ec1-41f8-99b6-7daeb3591033"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.685580 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52008989-6ec1-41f8-99b6-7daeb3591033" (UID: "52008989-6ec1-41f8-99b6-7daeb3591033"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.730928 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlmk8\" (UniqueName: \"kubernetes.io/projected/52008989-6ec1-41f8-99b6-7daeb3591033-kube-api-access-hlmk8\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.730960 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:31 crc kubenswrapper[4830]: I0311 09:37:31.730971 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52008989-6ec1-41f8-99b6-7daeb3591033-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.545752 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.590809 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.603836 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.619130 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:37:32 crc kubenswrapper[4830]: E0311 09:37:32.619617 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52008989-6ec1-41f8-99b6-7daeb3591033" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.619640 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="52008989-6ec1-41f8-99b6-7daeb3591033" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 09:37:32 crc kubenswrapper[4830]: 
I0311 09:37:32.620057 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="52008989-6ec1-41f8-99b6-7daeb3591033" containerName="nova-cell1-novncproxy-novncproxy" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.620897 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.626061 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.626378 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.627413 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.645831 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.751313 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.751405 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.751453 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-czv8z\" (UniqueName: \"kubernetes.io/projected/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-kube-api-access-czv8z\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.751503 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.751707 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.853706 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.853835 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.854070 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.854137 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.854195 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czv8z\" (UniqueName: \"kubernetes.io/projected/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-kube-api-access-czv8z\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.860719 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.860930 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.861857 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.862891 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.876106 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czv8z\" (UniqueName: \"kubernetes.io/projected/f690df7f-ca68-4a5d-8e9e-4d7d55df4773-kube-api-access-czv8z\") pod \"nova-cell1-novncproxy-0\" (UID: \"f690df7f-ca68-4a5d-8e9e-4d7d55df4773\") " pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.941837 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:32 crc kubenswrapper[4830]: I0311 09:37:32.958959 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52008989-6ec1-41f8-99b6-7daeb3591033" path="/var/lib/kubelet/pods/52008989-6ec1-41f8-99b6-7daeb3591033/volumes" Mar 11 09:37:33 crc kubenswrapper[4830]: I0311 09:37:33.442621 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 11 09:37:33 crc kubenswrapper[4830]: W0311 09:37:33.444520 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf690df7f_ca68_4a5d_8e9e_4d7d55df4773.slice/crio-fcbb44a7243ce8967d14f295c97f132e5b4986ef0d1717b4c93950f2f3e9fced WatchSource:0}: Error finding container fcbb44a7243ce8967d14f295c97f132e5b4986ef0d1717b4c93950f2f3e9fced: Status 404 returned error can't find the container with id fcbb44a7243ce8967d14f295c97f132e5b4986ef0d1717b4c93950f2f3e9fced Mar 11 09:37:33 crc kubenswrapper[4830]: I0311 09:37:33.557427 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f690df7f-ca68-4a5d-8e9e-4d7d55df4773","Type":"ContainerStarted","Data":"fcbb44a7243ce8967d14f295c97f132e5b4986ef0d1717b4c93950f2f3e9fced"} Mar 11 09:37:34 crc kubenswrapper[4830]: I0311 09:37:34.571339 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f690df7f-ca68-4a5d-8e9e-4d7d55df4773","Type":"ContainerStarted","Data":"fb3939f495a736baac78b4602b0073fb3b468e8f6c366bfceb7282ee70c96d0c"} Mar 11 09:37:34 crc kubenswrapper[4830]: I0311 09:37:34.593818 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.593803149 podStartE2EDuration="2.593803149s" podCreationTimestamp="2026-03-11 09:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:34.593722387 +0000 UTC m=+1422.374873126" watchObservedRunningTime="2026-03-11 09:37:34.593803149 +0000 UTC m=+1422.374953838" Mar 11 09:37:34 crc kubenswrapper[4830]: I0311 09:37:34.773521 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 09:37:34 crc kubenswrapper[4830]: I0311 09:37:34.774077 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 09:37:34 crc kubenswrapper[4830]: I0311 09:37:34.774265 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 11 09:37:34 crc kubenswrapper[4830]: I0311 09:37:34.776835 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.579630 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.582790 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.766217 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-dfj4r"] Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.784540 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.813871 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-config\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.813918 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkm5\" (UniqueName: \"kubernetes.io/projected/6edd0f6a-66c6-491e-9db5-0c9f617709d5-kube-api-access-crkm5\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.813981 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.814000 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.814049 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" 
(UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.814158 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.836217 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-dfj4r"] Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.917099 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-config\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.917155 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crkm5\" (UniqueName: \"kubernetes.io/projected/6edd0f6a-66c6-491e-9db5-0c9f617709d5-kube-api-access-crkm5\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.917585 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.917614 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.917996 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-config\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.918322 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.918470 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-svc\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.918512 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.919183 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-svc\") pod 
\"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.919767 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.919264 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:35 crc kubenswrapper[4830]: I0311 09:37:35.959436 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkm5\" (UniqueName: \"kubernetes.io/projected/6edd0f6a-66c6-491e-9db5-0c9f617709d5-kube-api-access-crkm5\") pod \"dnsmasq-dns-5c7b6c5df9-dfj4r\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:36 crc kubenswrapper[4830]: I0311 09:37:36.115860 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:36 crc kubenswrapper[4830]: I0311 09:37:36.571869 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-dfj4r"] Mar 11 09:37:36 crc kubenswrapper[4830]: W0311 09:37:36.575452 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6edd0f6a_66c6_491e_9db5_0c9f617709d5.slice/crio-f0da355977a72a9054aed548ebff2bc856c3c41a10574120d37f0748ae2c4282 WatchSource:0}: Error finding container f0da355977a72a9054aed548ebff2bc856c3c41a10574120d37f0748ae2c4282: Status 404 returned error can't find the container with id f0da355977a72a9054aed548ebff2bc856c3c41a10574120d37f0748ae2c4282 Mar 11 09:37:36 crc kubenswrapper[4830]: I0311 09:37:36.595975 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" event={"ID":"6edd0f6a-66c6-491e-9db5-0c9f617709d5","Type":"ContainerStarted","Data":"f0da355977a72a9054aed548ebff2bc856c3c41a10574120d37f0748ae2c4282"} Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.605992 4830 generic.go:334] "Generic (PLEG): container finished" podID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" containerID="0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b" exitCode=0 Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.606073 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" event={"ID":"6edd0f6a-66c6-491e-9db5-0c9f617709d5","Type":"ContainerDied","Data":"0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b"} Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.791471 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.792224 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="sg-core" containerID="cri-o://d0c0991b4d050af3eb3294ed45c785bd5352a9d79f14e037c90a9b50b0da4725" gracePeriod=30 Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.792260 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="proxy-httpd" containerID="cri-o://61faafb2893251f9dc50ac5b42478d9119453a2e90dfb9b2c101de96b68474fe" gracePeriod=30 Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.792349 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="ceilometer-notification-agent" containerID="cri-o://d636c03378ee5d37e67e81eab952ad6f9aa453e460316014a941b5dd62e2778a" gracePeriod=30 Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.792717 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="ceilometer-central-agent" containerID="cri-o://c13cbff24bc806ad2bbc1c368c70d15e4ad5373fed9df8de316f27efbb65abe0" gracePeriod=30 Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.797595 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.201:3000/\": read tcp 10.217.0.2:45198->10.217.0.201:3000: read: connection reset by peer" Mar 11 09:37:37 crc kubenswrapper[4830]: I0311 09:37:37.945712 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.149419 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.631178 4830 
generic.go:334] "Generic (PLEG): container finished" podID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerID="61faafb2893251f9dc50ac5b42478d9119453a2e90dfb9b2c101de96b68474fe" exitCode=0 Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.631477 4830 generic.go:334] "Generic (PLEG): container finished" podID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerID="d0c0991b4d050af3eb3294ed45c785bd5352a9d79f14e037c90a9b50b0da4725" exitCode=2 Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.631484 4830 generic.go:334] "Generic (PLEG): container finished" podID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerID="d636c03378ee5d37e67e81eab952ad6f9aa453e460316014a941b5dd62e2778a" exitCode=0 Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.631493 4830 generic.go:334] "Generic (PLEG): container finished" podID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerID="c13cbff24bc806ad2bbc1c368c70d15e4ad5373fed9df8de316f27efbb65abe0" exitCode=0 Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.631721 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerDied","Data":"61faafb2893251f9dc50ac5b42478d9119453a2e90dfb9b2c101de96b68474fe"} Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.631752 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerDied","Data":"d0c0991b4d050af3eb3294ed45c785bd5352a9d79f14e037c90a9b50b0da4725"} Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.631763 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerDied","Data":"d636c03378ee5d37e67e81eab952ad6f9aa453e460316014a941b5dd62e2778a"} Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.631774 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerDied","Data":"c13cbff24bc806ad2bbc1c368c70d15e4ad5373fed9df8de316f27efbb65abe0"} Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.633982 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-log" containerID="cri-o://a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14" gracePeriod=30 Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.635092 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" event={"ID":"6edd0f6a-66c6-491e-9db5-0c9f617709d5","Type":"ContainerStarted","Data":"83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d"} Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.635156 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-api" containerID="cri-o://a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724" gracePeriod=30 Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.635466 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.665240 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" podStartSLOduration=3.665220025 podStartE2EDuration="3.665220025s" podCreationTimestamp="2026-03-11 09:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:38.655566689 +0000 UTC m=+1426.436717378" watchObservedRunningTime="2026-03-11 09:37:38.665220025 +0000 UTC m=+1426.446370714" Mar 11 09:37:38 crc kubenswrapper[4830]: I0311 09:37:38.903755 4830 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.083521 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-config-data\") pod \"193c55e6-58a4-4b42-9692-cf7d6f555333\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.083573 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-run-httpd\") pod \"193c55e6-58a4-4b42-9692-cf7d6f555333\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.083602 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-combined-ca-bundle\") pod \"193c55e6-58a4-4b42-9692-cf7d6f555333\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.083642 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-log-httpd\") pod \"193c55e6-58a4-4b42-9692-cf7d6f555333\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.083663 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45q84\" (UniqueName: \"kubernetes.io/projected/193c55e6-58a4-4b42-9692-cf7d6f555333-kube-api-access-45q84\") pod \"193c55e6-58a4-4b42-9692-cf7d6f555333\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.083685 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-sg-core-conf-yaml\") pod \"193c55e6-58a4-4b42-9692-cf7d6f555333\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.083790 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-scripts\") pod \"193c55e6-58a4-4b42-9692-cf7d6f555333\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.083878 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-ceilometer-tls-certs\") pod \"193c55e6-58a4-4b42-9692-cf7d6f555333\" (UID: \"193c55e6-58a4-4b42-9692-cf7d6f555333\") " Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.085191 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "193c55e6-58a4-4b42-9692-cf7d6f555333" (UID: "193c55e6-58a4-4b42-9692-cf7d6f555333"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.085411 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "193c55e6-58a4-4b42-9692-cf7d6f555333" (UID: "193c55e6-58a4-4b42-9692-cf7d6f555333"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.096079 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-scripts" (OuterVolumeSpecName: "scripts") pod "193c55e6-58a4-4b42-9692-cf7d6f555333" (UID: "193c55e6-58a4-4b42-9692-cf7d6f555333"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.097647 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193c55e6-58a4-4b42-9692-cf7d6f555333-kube-api-access-45q84" (OuterVolumeSpecName: "kube-api-access-45q84") pod "193c55e6-58a4-4b42-9692-cf7d6f555333" (UID: "193c55e6-58a4-4b42-9692-cf7d6f555333"). InnerVolumeSpecName "kube-api-access-45q84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.132993 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "193c55e6-58a4-4b42-9692-cf7d6f555333" (UID: "193c55e6-58a4-4b42-9692-cf7d6f555333"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.185496 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "193c55e6-58a4-4b42-9692-cf7d6f555333" (UID: "193c55e6-58a4-4b42-9692-cf7d6f555333"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.186183 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.186209 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.186221 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/193c55e6-58a4-4b42-9692-cf7d6f555333-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.186232 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45q84\" (UniqueName: \"kubernetes.io/projected/193c55e6-58a4-4b42-9692-cf7d6f555333-kube-api-access-45q84\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.186243 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.186254 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.220349 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "193c55e6-58a4-4b42-9692-cf7d6f555333" (UID: 
"193c55e6-58a4-4b42-9692-cf7d6f555333"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.233475 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-config-data" (OuterVolumeSpecName: "config-data") pod "193c55e6-58a4-4b42-9692-cf7d6f555333" (UID: "193c55e6-58a4-4b42-9692-cf7d6f555333"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.288031 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.288065 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193c55e6-58a4-4b42-9692-cf7d6f555333-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.643733 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"193c55e6-58a4-4b42-9692-cf7d6f555333","Type":"ContainerDied","Data":"c59eedd392e5b5ba40f489143da86eb2fae70fd67d93380f6a75ba7c5699488f"} Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.643795 4830 scope.go:117] "RemoveContainer" containerID="61faafb2893251f9dc50ac5b42478d9119453a2e90dfb9b2c101de96b68474fe" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.643826 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.648170 4830 generic.go:334] "Generic (PLEG): container finished" podID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerID="a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14" exitCode=143 Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.648204 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"085b7222-f42d-47e7-bcdb-f3e41e3333b4","Type":"ContainerDied","Data":"a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14"} Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.666342 4830 scope.go:117] "RemoveContainer" containerID="d0c0991b4d050af3eb3294ed45c785bd5352a9d79f14e037c90a9b50b0da4725" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.688250 4830 scope.go:117] "RemoveContainer" containerID="d636c03378ee5d37e67e81eab952ad6f9aa453e460316014a941b5dd62e2778a" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.696042 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.707169 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.710009 4830 scope.go:117] "RemoveContainer" containerID="c13cbff24bc806ad2bbc1c368c70d15e4ad5373fed9df8de316f27efbb65abe0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.748063 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:39 crc kubenswrapper[4830]: E0311 09:37:39.748753 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="proxy-httpd" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.748770 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="proxy-httpd" Mar 11 09:37:39 crc 
kubenswrapper[4830]: E0311 09:37:39.748798 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="ceilometer-central-agent" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.748804 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="ceilometer-central-agent" Mar 11 09:37:39 crc kubenswrapper[4830]: E0311 09:37:39.748819 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="sg-core" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.748825 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="sg-core" Mar 11 09:37:39 crc kubenswrapper[4830]: E0311 09:37:39.748843 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="ceilometer-notification-agent" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.748850 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="ceilometer-notification-agent" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.749504 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="sg-core" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.749537 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="proxy-httpd" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.749556 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" containerName="ceilometer-central-agent" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.749583 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" 
containerName="ceilometer-notification-agent" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.752863 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.760412 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.761644 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.761539 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.762092 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.797192 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.797519 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.797653 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpmmd\" (UniqueName: \"kubernetes.io/projected/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-kube-api-access-mpmmd\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " 
pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.797742 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-scripts\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.797970 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.798170 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.798308 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.798435 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-config-data\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.824487 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 11 09:37:39 crc kubenswrapper[4830]: E0311 09:37:39.828643 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-mpmmd log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="ef184b0c-47d3-407a-bcae-e8ec605e9cdd" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.900529 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.900878 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.901372 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-config-data\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.901853 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.901943 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.902045 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpmmd\" (UniqueName: \"kubernetes.io/projected/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-kube-api-access-mpmmd\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.902125 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-scripts\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.902207 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.902278 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-log-httpd\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.902653 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-run-httpd\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc 
kubenswrapper[4830]: I0311 09:37:39.904633 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.905457 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.905727 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-scripts\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.905766 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-config-data\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.908689 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:39 crc kubenswrapper[4830]: I0311 09:37:39.920975 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpmmd\" (UniqueName: \"kubernetes.io/projected/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-kube-api-access-mpmmd\") pod \"ceilometer-0\" 
(UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " pod="openstack/ceilometer-0" Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.656469 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.672402 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.715859 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-sg-core-conf-yaml\") pod \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.716008 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpmmd\" (UniqueName: \"kubernetes.io/projected/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-kube-api-access-mpmmd\") pod \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.716119 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-config-data\") pod \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.716176 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-combined-ca-bundle\") pod \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.721560 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef184b0c-47d3-407a-bcae-e8ec605e9cdd" (UID: "ef184b0c-47d3-407a-bcae-e8ec605e9cdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.722498 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ef184b0c-47d3-407a-bcae-e8ec605e9cdd" (UID: "ef184b0c-47d3-407a-bcae-e8ec605e9cdd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.723507 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-kube-api-access-mpmmd" (OuterVolumeSpecName: "kube-api-access-mpmmd") pod "ef184b0c-47d3-407a-bcae-e8ec605e9cdd" (UID: "ef184b0c-47d3-407a-bcae-e8ec605e9cdd"). InnerVolumeSpecName "kube-api-access-mpmmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.724152 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-config-data" (OuterVolumeSpecName: "config-data") pod "ef184b0c-47d3-407a-bcae-e8ec605e9cdd" (UID: "ef184b0c-47d3-407a-bcae-e8ec605e9cdd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817191 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-log-httpd\") pod \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817275 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-run-httpd\") pod \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817300 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-scripts\") pod \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817323 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-ceilometer-tls-certs\") pod \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\" (UID: \"ef184b0c-47d3-407a-bcae-e8ec605e9cdd\") " Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817593 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ef184b0c-47d3-407a-bcae-e8ec605e9cdd" (UID: "ef184b0c-47d3-407a-bcae-e8ec605e9cdd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817611 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ef184b0c-47d3-407a-bcae-e8ec605e9cdd" (UID: "ef184b0c-47d3-407a-bcae-e8ec605e9cdd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817680 4830 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817692 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpmmd\" (UniqueName: \"kubernetes.io/projected/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-kube-api-access-mpmmd\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817702 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.817711 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.821108 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ef184b0c-47d3-407a-bcae-e8ec605e9cdd" (UID: "ef184b0c-47d3-407a-bcae-e8ec605e9cdd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.823198 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-scripts" (OuterVolumeSpecName: "scripts") pod "ef184b0c-47d3-407a-bcae-e8ec605e9cdd" (UID: "ef184b0c-47d3-407a-bcae-e8ec605e9cdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.919668 4830 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.919702 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-scripts\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.919712 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.919721 4830 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef184b0c-47d3-407a-bcae-e8ec605e9cdd-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:40 crc kubenswrapper[4830]: I0311 09:37:40.943380 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="193c55e6-58a4-4b42-9692-cf7d6f555333" path="/var/lib/kubelet/pods/193c55e6-58a4-4b42-9692-cf7d6f555333/volumes"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.669355 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.735527 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.743571 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.751775 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.753847 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.758363 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.758749 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.758967 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.767870 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.837262 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.837485 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a97be49-616f-4338-b04a-9928016b4c26-run-httpd\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.837531 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.837616 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-config-data\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.837663 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn5fx\" (UniqueName: \"kubernetes.io/projected/9a97be49-616f-4338-b04a-9928016b4c26-kube-api-access-dn5fx\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.837743 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.837795 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-scripts\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.837869 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a97be49-616f-4338-b04a-9928016b4c26-log-httpd\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.940304 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a97be49-616f-4338-b04a-9928016b4c26-run-httpd\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.940391 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.940419 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-config-data\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.940441 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn5fx\" (UniqueName: \"kubernetes.io/projected/9a97be49-616f-4338-b04a-9928016b4c26-kube-api-access-dn5fx\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.940603 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.940645 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-scripts\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.940782 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a97be49-616f-4338-b04a-9928016b4c26-log-httpd\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.940906 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.941101 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a97be49-616f-4338-b04a-9928016b4c26-run-httpd\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.942319 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a97be49-616f-4338-b04a-9928016b4c26-log-httpd\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.953412 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.954280 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-scripts\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.954497 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-config-data\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.955151 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.955758 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a97be49-616f-4338-b04a-9928016b4c26-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:41 crc kubenswrapper[4830]: I0311 09:37:41.956206 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn5fx\" (UniqueName: \"kubernetes.io/projected/9a97be49-616f-4338-b04a-9928016b4c26-kube-api-access-dn5fx\") pod \"ceilometer-0\" (UID: \"9a97be49-616f-4338-b04a-9928016b4c26\") " pod="openstack/ceilometer-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.119498 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.244324 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.348474 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-combined-ca-bundle\") pod \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") "
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.348630 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-config-data\") pod \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") "
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.348696 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085b7222-f42d-47e7-bcdb-f3e41e3333b4-logs\") pod \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") "
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.348721 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-847pt\" (UniqueName: \"kubernetes.io/projected/085b7222-f42d-47e7-bcdb-f3e41e3333b4-kube-api-access-847pt\") pod \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\" (UID: \"085b7222-f42d-47e7-bcdb-f3e41e3333b4\") "
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.349213 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085b7222-f42d-47e7-bcdb-f3e41e3333b4-logs" (OuterVolumeSpecName: "logs") pod "085b7222-f42d-47e7-bcdb-f3e41e3333b4" (UID: "085b7222-f42d-47e7-bcdb-f3e41e3333b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.355729 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085b7222-f42d-47e7-bcdb-f3e41e3333b4-kube-api-access-847pt" (OuterVolumeSpecName: "kube-api-access-847pt") pod "085b7222-f42d-47e7-bcdb-f3e41e3333b4" (UID: "085b7222-f42d-47e7-bcdb-f3e41e3333b4"). InnerVolumeSpecName "kube-api-access-847pt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.383135 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-config-data" (OuterVolumeSpecName: "config-data") pod "085b7222-f42d-47e7-bcdb-f3e41e3333b4" (UID: "085b7222-f42d-47e7-bcdb-f3e41e3333b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.403183 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "085b7222-f42d-47e7-bcdb-f3e41e3333b4" (UID: "085b7222-f42d-47e7-bcdb-f3e41e3333b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.454951 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.454987 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/085b7222-f42d-47e7-bcdb-f3e41e3333b4-logs\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.454997 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-847pt\" (UniqueName: \"kubernetes.io/projected/085b7222-f42d-47e7-bcdb-f3e41e3333b4-kube-api-access-847pt\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.455009 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085b7222-f42d-47e7-bcdb-f3e41e3333b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.675886 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 11 09:37:42 crc kubenswrapper[4830]: W0311 09:37:42.679601 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a97be49_616f_4338_b04a_9928016b4c26.slice/crio-ea8754682049b679d3824cfcf3cb9849b6f71580c6431daf250889be74a03fbe WatchSource:0}: Error finding container ea8754682049b679d3824cfcf3cb9849b6f71580c6431daf250889be74a03fbe: Status 404 returned error can't find the container with id ea8754682049b679d3824cfcf3cb9849b6f71580c6431daf250889be74a03fbe
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.681775 4830 generic.go:334] "Generic (PLEG): container finished" podID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerID="a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724" exitCode=0
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.681819 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"085b7222-f42d-47e7-bcdb-f3e41e3333b4","Type":"ContainerDied","Data":"a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724"}
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.681848 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"085b7222-f42d-47e7-bcdb-f3e41e3333b4","Type":"ContainerDied","Data":"787727fafc980463e56bbaaff4f3c244b9f3f27e046c6112b139fda6de5ef241"}
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.681868 4830 scope.go:117] "RemoveContainer" containerID="a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.682029 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.725404 4830 scope.go:117] "RemoveContainer" containerID="a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.739075 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.749966 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.750696 4830 scope.go:117] "RemoveContainer" containerID="a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724"
Mar 11 09:37:42 crc kubenswrapper[4830]: E0311 09:37:42.751069 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724\": container with ID starting with a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724 not found: ID does not exist" containerID="a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.751104 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724"} err="failed to get container status \"a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724\": rpc error: code = NotFound desc = could not find container \"a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724\": container with ID starting with a3085ffbf0b014e81b3b01bb4e546772628f7c42a910f21ff3dc298834ee8724 not found: ID does not exist"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.751133 4830 scope.go:117] "RemoveContainer" containerID="a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14"
Mar 11 09:37:42 crc kubenswrapper[4830]: E0311 09:37:42.751417 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14\": container with ID starting with a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14 not found: ID does not exist" containerID="a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.751454 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14"} err="failed to get container status \"a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14\": rpc error: code = NotFound desc = could not find container \"a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14\": container with ID starting with a791453f2a689e92dcef0b22db87dd618ef37b40bc5456fa8f48aa45361f1f14 not found: ID does not exist"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.758773 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:42 crc kubenswrapper[4830]: E0311 09:37:42.759227 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-log"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.759245 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-log"
Mar 11 09:37:42 crc kubenswrapper[4830]: E0311 09:37:42.759259 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-api"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.759266 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-api"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.759467 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-log"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.759498 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" containerName="nova-api-api"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.761414 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.764907 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.765244 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.765379 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.768750 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.949544 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085b7222-f42d-47e7-bcdb-f3e41e3333b4" path="/var/lib/kubelet/pods/085b7222-f42d-47e7-bcdb-f3e41e3333b4/volumes"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.950321 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef184b0c-47d3-407a-bcae-e8ec605e9cdd" path="/var/lib/kubelet/pods/ef184b0c-47d3-407a-bcae-e8ec605e9cdd/volumes"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.950751 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.964342 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bd4s\" (UniqueName: \"kubernetes.io/projected/c653d82f-7385-4ed9-b132-29d30f118f98-kube-api-access-6bd4s\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.964407 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.964425 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-public-tls-certs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.964653 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-config-data\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.964718 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.964794 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c653d82f-7385-4ed9-b132-29d30f118f98-logs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:42 crc kubenswrapper[4830]: I0311 09:37:42.970763 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.066916 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c653d82f-7385-4ed9-b132-29d30f118f98-logs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.067085 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bd4s\" (UniqueName: \"kubernetes.io/projected/c653d82f-7385-4ed9-b132-29d30f118f98-kube-api-access-6bd4s\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.067141 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.067163 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-public-tls-certs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.067247 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-config-data\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.067338 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.068175 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c653d82f-7385-4ed9-b132-29d30f118f98-logs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.073605 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-public-tls-certs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.073707 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.074055 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.076654 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-config-data\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.083395 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bd4s\" (UniqueName: \"kubernetes.io/projected/c653d82f-7385-4ed9-b132-29d30f118f98-kube-api-access-6bd4s\") pod \"nova-api-0\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.382517 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.698729 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a97be49-616f-4338-b04a-9928016b4c26","Type":"ContainerStarted","Data":"dc0768f126513e61045cbfcdb5fdce2dc4ac7a5d7f08310f66d3f403e9d911fb"}
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.698977 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a97be49-616f-4338-b04a-9928016b4c26","Type":"ContainerStarted","Data":"ea8754682049b679d3824cfcf3cb9849b6f71580c6431daf250889be74a03fbe"}
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.723076 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 11 09:37:43 crc kubenswrapper[4830]: I0311 09:37:43.898041 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.045418 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7p4r5"]
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.046562 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.047901 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.051462 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.062762 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7p4r5"]
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.198172 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-config-data\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.198476 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.198512 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-scripts\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.198530 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzz8n\" (UniqueName: \"kubernetes.io/projected/76673d7a-d07c-4dd4-804b-c18820921185-kube-api-access-zzz8n\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.300320 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-config-data\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.300373 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.300422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-scripts\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.300448 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzz8n\" (UniqueName: \"kubernetes.io/projected/76673d7a-d07c-4dd4-804b-c18820921185-kube-api-access-zzz8n\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.313739 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-scripts\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.324991 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-config-data\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.326685 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.345541 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzz8n\" (UniqueName: \"kubernetes.io/projected/76673d7a-d07c-4dd4-804b-c18820921185-kube-api-access-zzz8n\") pod \"nova-cell1-cell-mapping-7p4r5\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " pod="openstack/nova-cell1-cell-mapping-7p4r5"
Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.510230 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7p4r5" Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.710080 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a97be49-616f-4338-b04a-9928016b4c26","Type":"ContainerStarted","Data":"4b26a7d2c7b4a29f80d537aec2eff27a9a7327872dd1ba14bfa4298d49eecd2a"} Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.712308 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c653d82f-7385-4ed9-b132-29d30f118f98","Type":"ContainerStarted","Data":"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0"} Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.712336 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c653d82f-7385-4ed9-b132-29d30f118f98","Type":"ContainerStarted","Data":"08d3218f0c04d13f6f06c14e1f96f9cddbcf228ae95f6035277b9d424408fc0f"} Mar 11 09:37:44 crc kubenswrapper[4830]: I0311 09:37:44.945988 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7p4r5"] Mar 11 09:37:44 crc kubenswrapper[4830]: W0311 09:37:44.947846 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76673d7a_d07c_4dd4_804b_c18820921185.slice/crio-bb3970a11d3b934fc6ec4c4273e14567643b7d343bcca4435da30ce0997cd65a WatchSource:0}: Error finding container bb3970a11d3b934fc6ec4c4273e14567643b7d343bcca4435da30ce0997cd65a: Status 404 returned error can't find the container with id bb3970a11d3b934fc6ec4c4273e14567643b7d343bcca4435da30ce0997cd65a Mar 11 09:37:45 crc kubenswrapper[4830]: I0311 09:37:45.724098 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c653d82f-7385-4ed9-b132-29d30f118f98","Type":"ContainerStarted","Data":"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1"} Mar 11 09:37:45 crc 
kubenswrapper[4830]: I0311 09:37:45.728927 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a97be49-616f-4338-b04a-9928016b4c26","Type":"ContainerStarted","Data":"41befec71857d3a84a5c6201671bac8387b91e3cec5629886f6abc4dabad63ae"} Mar 11 09:37:45 crc kubenswrapper[4830]: I0311 09:37:45.733709 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7p4r5" event={"ID":"76673d7a-d07c-4dd4-804b-c18820921185","Type":"ContainerStarted","Data":"f9eedf5d32bbdc2d41dcdd433c5ff6b20b9a5419f104c5b6dc02d8c361f10797"} Mar 11 09:37:45 crc kubenswrapper[4830]: I0311 09:37:45.733935 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7p4r5" event={"ID":"76673d7a-d07c-4dd4-804b-c18820921185","Type":"ContainerStarted","Data":"bb3970a11d3b934fc6ec4c4273e14567643b7d343bcca4435da30ce0997cd65a"} Mar 11 09:37:45 crc kubenswrapper[4830]: I0311 09:37:45.752914 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.7528927469999998 podStartE2EDuration="3.752892747s" podCreationTimestamp="2026-03-11 09:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:45.746899102 +0000 UTC m=+1433.528049861" watchObservedRunningTime="2026-03-11 09:37:45.752892747 +0000 UTC m=+1433.534043436" Mar 11 09:37:45 crc kubenswrapper[4830]: I0311 09:37:45.770943 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7p4r5" podStartSLOduration=1.770925215 podStartE2EDuration="1.770925215s" podCreationTimestamp="2026-03-11 09:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:45.764798216 +0000 UTC m=+1433.545948905" watchObservedRunningTime="2026-03-11 
09:37:45.770925215 +0000 UTC m=+1433.552075904" Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.118203 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.180813 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-v79d9"] Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.181116 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" podUID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" containerName="dnsmasq-dns" containerID="cri-o://acb320b4280fed5ec1050ed40d9719952f4b6f7144d5345114350863442fb9d4" gracePeriod=10 Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.746574 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a97be49-616f-4338-b04a-9928016b4c26","Type":"ContainerStarted","Data":"5f78274ffd10c2eddf7c44b6c790a03cf7482c4d87a5d739bf274102db86529b"} Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.746991 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.748891 4830 generic.go:334] "Generic (PLEG): container finished" podID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" containerID="acb320b4280fed5ec1050ed40d9719952f4b6f7144d5345114350863442fb9d4" exitCode=0 Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.748964 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" event={"ID":"46c0d3f7-fb22-458f-a74d-a4fb397d3d68","Type":"ContainerDied","Data":"acb320b4280fed5ec1050ed40d9719952f4b6f7144d5345114350863442fb9d4"} Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.749025 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" 
event={"ID":"46c0d3f7-fb22-458f-a74d-a4fb397d3d68","Type":"ContainerDied","Data":"dff143a38561dfe9da81146945b3600213fd20877f9524eb9b81730cb0a6648f"} Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.749040 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dff143a38561dfe9da81146945b3600213fd20877f9524eb9b81730cb0a6648f" Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.773644 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.177013847 podStartE2EDuration="5.773625335s" podCreationTimestamp="2026-03-11 09:37:41 +0000 UTC" firstStartedPulling="2026-03-11 09:37:42.681949121 +0000 UTC m=+1430.463099810" lastFinishedPulling="2026-03-11 09:37:46.278560609 +0000 UTC m=+1434.059711298" observedRunningTime="2026-03-11 09:37:46.769988805 +0000 UTC m=+1434.551139514" watchObservedRunningTime="2026-03-11 09:37:46.773625335 +0000 UTC m=+1434.554776024" Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.797307 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.963935 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-nb\") pod \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.964144 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9m2s\" (UniqueName: \"kubernetes.io/projected/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-kube-api-access-f9m2s\") pod \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.964326 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-swift-storage-0\") pod \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.964372 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-config\") pod \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.964433 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-svc\") pod \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.964464 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-sb\") pod \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\" (UID: \"46c0d3f7-fb22-458f-a74d-a4fb397d3d68\") " Mar 11 09:37:46 crc kubenswrapper[4830]: I0311 09:37:46.988338 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-kube-api-access-f9m2s" (OuterVolumeSpecName: "kube-api-access-f9m2s") pod "46c0d3f7-fb22-458f-a74d-a4fb397d3d68" (UID: "46c0d3f7-fb22-458f-a74d-a4fb397d3d68"). InnerVolumeSpecName "kube-api-access-f9m2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.019867 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "46c0d3f7-fb22-458f-a74d-a4fb397d3d68" (UID: "46c0d3f7-fb22-458f-a74d-a4fb397d3d68"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.022905 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46c0d3f7-fb22-458f-a74d-a4fb397d3d68" (UID: "46c0d3f7-fb22-458f-a74d-a4fb397d3d68"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.024387 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-config" (OuterVolumeSpecName: "config") pod "46c0d3f7-fb22-458f-a74d-a4fb397d3d68" (UID: "46c0d3f7-fb22-458f-a74d-a4fb397d3d68"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.033350 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46c0d3f7-fb22-458f-a74d-a4fb397d3d68" (UID: "46c0d3f7-fb22-458f-a74d-a4fb397d3d68"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.049313 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46c0d3f7-fb22-458f-a74d-a4fb397d3d68" (UID: "46c0d3f7-fb22-458f-a74d-a4fb397d3d68"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.066750 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.066958 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.067099 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.067186 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9m2s\" (UniqueName: \"kubernetes.io/projected/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-kube-api-access-f9m2s\") on node \"crc\" DevicePath \"\"" Mar 11 
09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.067273 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.067362 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c0d3f7-fb22-458f-a74d-a4fb397d3d68-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.755827 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-865f5d856f-v79d9" Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.818534 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-v79d9"] Mar 11 09:37:47 crc kubenswrapper[4830]: I0311 09:37:47.837289 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-865f5d856f-v79d9"] Mar 11 09:37:48 crc kubenswrapper[4830]: I0311 09:37:48.944992 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" path="/var/lib/kubelet/pods/46c0d3f7-fb22-458f-a74d-a4fb397d3d68/volumes" Mar 11 09:37:50 crc kubenswrapper[4830]: I0311 09:37:50.780696 4830 generic.go:334] "Generic (PLEG): container finished" podID="76673d7a-d07c-4dd4-804b-c18820921185" containerID="f9eedf5d32bbdc2d41dcdd433c5ff6b20b9a5419f104c5b6dc02d8c361f10797" exitCode=0 Mar 11 09:37:50 crc kubenswrapper[4830]: I0311 09:37:50.780758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7p4r5" event={"ID":"76673d7a-d07c-4dd4-804b-c18820921185","Type":"ContainerDied","Data":"f9eedf5d32bbdc2d41dcdd433c5ff6b20b9a5419f104c5b6dc02d8c361f10797"} Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.182951 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7p4r5" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.266794 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-scripts\") pod \"76673d7a-d07c-4dd4-804b-c18820921185\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.267103 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzz8n\" (UniqueName: \"kubernetes.io/projected/76673d7a-d07c-4dd4-804b-c18820921185-kube-api-access-zzz8n\") pod \"76673d7a-d07c-4dd4-804b-c18820921185\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.267135 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-config-data\") pod \"76673d7a-d07c-4dd4-804b-c18820921185\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.267151 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-combined-ca-bundle\") pod \"76673d7a-d07c-4dd4-804b-c18820921185\" (UID: \"76673d7a-d07c-4dd4-804b-c18820921185\") " Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.274370 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76673d7a-d07c-4dd4-804b-c18820921185-kube-api-access-zzz8n" (OuterVolumeSpecName: "kube-api-access-zzz8n") pod "76673d7a-d07c-4dd4-804b-c18820921185" (UID: "76673d7a-d07c-4dd4-804b-c18820921185"). InnerVolumeSpecName "kube-api-access-zzz8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.274398 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-scripts" (OuterVolumeSpecName: "scripts") pod "76673d7a-d07c-4dd4-804b-c18820921185" (UID: "76673d7a-d07c-4dd4-804b-c18820921185"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.296787 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-config-data" (OuterVolumeSpecName: "config-data") pod "76673d7a-d07c-4dd4-804b-c18820921185" (UID: "76673d7a-d07c-4dd4-804b-c18820921185"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.299597 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76673d7a-d07c-4dd4-804b-c18820921185" (UID: "76673d7a-d07c-4dd4-804b-c18820921185"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.369184 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzz8n\" (UniqueName: \"kubernetes.io/projected/76673d7a-d07c-4dd4-804b-c18820921185-kube-api-access-zzz8n\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.369221 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.369232 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.369239 4830 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76673d7a-d07c-4dd4-804b-c18820921185-scripts\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.799693 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7p4r5" event={"ID":"76673d7a-d07c-4dd4-804b-c18820921185","Type":"ContainerDied","Data":"bb3970a11d3b934fc6ec4c4273e14567643b7d343bcca4435da30ce0997cd65a"} Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.799748 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb3970a11d3b934fc6ec4c4273e14567643b7d343bcca4435da30ce0997cd65a" Mar 11 09:37:52 crc kubenswrapper[4830]: I0311 09:37:52.799775 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7p4r5" Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.003097 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.003394 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" containerName="nova-api-log" containerID="cri-o://8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0" gracePeriod=30 Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.003594 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" containerName="nova-api-api" containerID="cri-o://96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1" gracePeriod=30 Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.037054 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.037318 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="173258f5-abf0-4ddb-ba63-02eb03db5521" containerName="nova-scheduler-scheduler" containerID="cri-o://8764e41a667c05f357d011d34a1e5c0fc8fd12d4f29e1fd5e3eca5a9ae142271" gracePeriod=30 Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.054298 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.054554 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-log" containerID="cri-o://fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea" gracePeriod=30 Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.054639 4830 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-metadata" containerID="cri-o://055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741" gracePeriod=30 Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.643733 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.692769 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-combined-ca-bundle\") pod \"c653d82f-7385-4ed9-b132-29d30f118f98\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.692840 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bd4s\" (UniqueName: \"kubernetes.io/projected/c653d82f-7385-4ed9-b132-29d30f118f98-kube-api-access-6bd4s\") pod \"c653d82f-7385-4ed9-b132-29d30f118f98\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.692926 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c653d82f-7385-4ed9-b132-29d30f118f98-logs\") pod \"c653d82f-7385-4ed9-b132-29d30f118f98\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.692981 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-public-tls-certs\") pod \"c653d82f-7385-4ed9-b132-29d30f118f98\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.693002 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-config-data\") pod \"c653d82f-7385-4ed9-b132-29d30f118f98\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.693039 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-internal-tls-certs\") pod \"c653d82f-7385-4ed9-b132-29d30f118f98\" (UID: \"c653d82f-7385-4ed9-b132-29d30f118f98\") " Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.694155 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c653d82f-7385-4ed9-b132-29d30f118f98-logs" (OuterVolumeSpecName: "logs") pod "c653d82f-7385-4ed9-b132-29d30f118f98" (UID: "c653d82f-7385-4ed9-b132-29d30f118f98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.699162 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c653d82f-7385-4ed9-b132-29d30f118f98-kube-api-access-6bd4s" (OuterVolumeSpecName: "kube-api-access-6bd4s") pod "c653d82f-7385-4ed9-b132-29d30f118f98" (UID: "c653d82f-7385-4ed9-b132-29d30f118f98"). InnerVolumeSpecName "kube-api-access-6bd4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.725586 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-config-data" (OuterVolumeSpecName: "config-data") pod "c653d82f-7385-4ed9-b132-29d30f118f98" (UID: "c653d82f-7385-4ed9-b132-29d30f118f98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.730425 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c653d82f-7385-4ed9-b132-29d30f118f98" (UID: "c653d82f-7385-4ed9-b132-29d30f118f98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.751437 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c653d82f-7385-4ed9-b132-29d30f118f98" (UID: "c653d82f-7385-4ed9-b132-29d30f118f98"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.751698 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c653d82f-7385-4ed9-b132-29d30f118f98" (UID: "c653d82f-7385-4ed9-b132-29d30f118f98"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.770624 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8764e41a667c05f357d011d34a1e5c0fc8fd12d4f29e1fd5e3eca5a9ae142271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.772432 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8764e41a667c05f357d011d34a1e5c0fc8fd12d4f29e1fd5e3eca5a9ae142271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.773953 4830 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8764e41a667c05f357d011d34a1e5c0fc8fd12d4f29e1fd5e3eca5a9ae142271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.774004 4830 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="173258f5-abf0-4ddb-ba63-02eb03db5521" containerName="nova-scheduler-scheduler"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.802518 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bd4s\" (UniqueName: \"kubernetes.io/projected/c653d82f-7385-4ed9-b132-29d30f118f98-kube-api-access-6bd4s\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.802577 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c653d82f-7385-4ed9-b132-29d30f118f98-logs\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.802591 4830 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.806543 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.806590 4830 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.806607 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c653d82f-7385-4ed9-b132-29d30f118f98-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.817380 4830 generic.go:334] "Generic (PLEG): container finished" podID="81e312c8-3719-4bea-8034-0958a37b831f" containerID="fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea" exitCode=143
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.817458 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81e312c8-3719-4bea-8034-0958a37b831f","Type":"ContainerDied","Data":"fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea"}
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.820582 4830 generic.go:334] "Generic (PLEG): container finished" podID="c653d82f-7385-4ed9-b132-29d30f118f98" containerID="96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1" exitCode=0
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.820617 4830 generic.go:334] "Generic (PLEG): container finished" podID="c653d82f-7385-4ed9-b132-29d30f118f98" containerID="8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0" exitCode=143
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.820647 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c653d82f-7385-4ed9-b132-29d30f118f98","Type":"ContainerDied","Data":"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1"}
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.820680 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c653d82f-7385-4ed9-b132-29d30f118f98","Type":"ContainerDied","Data":"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0"}
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.820692 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c653d82f-7385-4ed9-b132-29d30f118f98","Type":"ContainerDied","Data":"08d3218f0c04d13f6f06c14e1f96f9cddbcf228ae95f6035277b9d424408fc0f"}
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.820710 4830 scope.go:117] "RemoveContainer" containerID="96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.820891 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.856997 4830 scope.go:117] "RemoveContainer" containerID="8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.872179 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.884204 4830 scope.go:117] "RemoveContainer" containerID="96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.884309 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.885907 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1\": container with ID starting with 96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1 not found: ID does not exist" containerID="96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.885946 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1"} err="failed to get container status \"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1\": rpc error: code = NotFound desc = could not find container \"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1\": container with ID starting with 96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1 not found: ID does not exist"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.885974 4830 scope.go:117] "RemoveContainer" containerID="8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0"
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.887725 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0\": container with ID starting with 8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0 not found: ID does not exist" containerID="8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.887757 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0"} err="failed to get container status \"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0\": rpc error: code = NotFound desc = could not find container \"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0\": container with ID starting with 8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0 not found: ID does not exist"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.887783 4830 scope.go:117] "RemoveContainer" containerID="96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.888115 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1"} err="failed to get container status \"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1\": rpc error: code = NotFound desc = could not find container \"96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1\": container with ID starting with 96537c75867e9e2810e8426b486ef9e81386b615ab224189a3df8b1e2b4e1af1 not found: ID does not exist"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.888139 4830 scope.go:117] "RemoveContainer" containerID="8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.888416 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0"} err="failed to get container status \"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0\": rpc error: code = NotFound desc = could not find container \"8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0\": container with ID starting with 8365fb67988806128400f10496f835d0bb9268586914e33f4a3fdf00fb41e6e0 not found: ID does not exist"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.905960 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.906411 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76673d7a-d07c-4dd4-804b-c18820921185" containerName="nova-manage"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906433 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="76673d7a-d07c-4dd4-804b-c18820921185" containerName="nova-manage"
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.906444 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" containerName="init"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906451 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" containerName="init"
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.906465 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" containerName="dnsmasq-dns"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906474 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" containerName="dnsmasq-dns"
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.906491 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" containerName="nova-api-log"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906497 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" containerName="nova-api-log"
Mar 11 09:37:53 crc kubenswrapper[4830]: E0311 09:37:53.906505 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" containerName="nova-api-api"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906510 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" containerName="nova-api-api"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906667 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" containerName="nova-api-log"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906690 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c0d3f7-fb22-458f-a74d-a4fb397d3d68" containerName="dnsmasq-dns"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906704 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="76673d7a-d07c-4dd4-804b-c18820921185" containerName="nova-manage"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.906717 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" containerName="nova-api-api"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.907676 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.910375 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.910550 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.910701 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 11 09:37:53 crc kubenswrapper[4830]: I0311 09:37:53.915459 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.011235 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz8x4\" (UniqueName: \"kubernetes.io/projected/d3eb0127-a012-4cbf-8768-84e20518f316-kube-api-access-gz8x4\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.011297 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.011531 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3eb0127-a012-4cbf-8768-84e20518f316-logs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.011598 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.011882 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-public-tls-certs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.011996 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-config-data\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.114188 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3eb0127-a012-4cbf-8768-84e20518f316-logs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.114710 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.114651 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3eb0127-a012-4cbf-8768-84e20518f316-logs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.114920 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-public-tls-certs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.115725 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-config-data\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.115833 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz8x4\" (UniqueName: \"kubernetes.io/projected/d3eb0127-a012-4cbf-8768-84e20518f316-kube-api-access-gz8x4\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.115861 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.121858 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.124184 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-config-data\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.124810 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-public-tls-certs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.125311 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3eb0127-a012-4cbf-8768-84e20518f316-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.136143 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz8x4\" (UniqueName: \"kubernetes.io/projected/d3eb0127-a012-4cbf-8768-84e20518f316-kube-api-access-gz8x4\") pod \"nova-api-0\" (UID: \"d3eb0127-a012-4cbf-8768-84e20518f316\") " pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.226995 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.679846 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.838436 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3eb0127-a012-4cbf-8768-84e20518f316","Type":"ContainerStarted","Data":"2d883a1f1b47ccfd036b356821180054c14a58ebd86ca26b3f803c5dd2d54a43"}
Mar 11 09:37:54 crc kubenswrapper[4830]: I0311 09:37:54.946599 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c653d82f-7385-4ed9-b132-29d30f118f98" path="/var/lib/kubelet/pods/c653d82f-7385-4ed9-b132-29d30f118f98/volumes"
Mar 11 09:37:55 crc kubenswrapper[4830]: I0311 09:37:55.851202 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3eb0127-a012-4cbf-8768-84e20518f316","Type":"ContainerStarted","Data":"1d6cc77374e34c80f5a6e4a715d50fb0e9842ca4640f7274458d166af6b1ea68"}
Mar 11 09:37:55 crc kubenswrapper[4830]: I0311 09:37:55.851589 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3eb0127-a012-4cbf-8768-84e20518f316","Type":"ContainerStarted","Data":"5ff80b5d34dd10770ba1b40d2fd53f23a43a171f95688f2d5c1bd32663c7ba19"}
Mar 11 09:37:55 crc kubenswrapper[4830]: I0311 09:37:55.875766 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.875744458 podStartE2EDuration="2.875744458s" podCreationTimestamp="2026-03-11 09:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:55.869427184 +0000 UTC m=+1443.650577923" watchObservedRunningTime="2026-03-11 09:37:55.875744458 +0000 UTC m=+1443.656895147"
Mar 11 09:37:56 crc kubenswrapper[4830]: I0311 09:37:56.184369 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:47574->10.217.0.196:8775: read: connection reset by peer"
Mar 11 09:37:56 crc kubenswrapper[4830]: I0311 09:37:56.184413 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:47564->10.217.0.196:8775: read: connection reset by peer"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.651314 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.767884 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p76jc\" (UniqueName: \"kubernetes.io/projected/81e312c8-3719-4bea-8034-0958a37b831f-kube-api-access-p76jc\") pod \"81e312c8-3719-4bea-8034-0958a37b831f\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") "
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.768811 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-combined-ca-bundle\") pod \"81e312c8-3719-4bea-8034-0958a37b831f\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") "
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.768843 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-nova-metadata-tls-certs\") pod \"81e312c8-3719-4bea-8034-0958a37b831f\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") "
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.768875 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-config-data\") pod \"81e312c8-3719-4bea-8034-0958a37b831f\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") "
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.768918 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e312c8-3719-4bea-8034-0958a37b831f-logs\") pod \"81e312c8-3719-4bea-8034-0958a37b831f\" (UID: \"81e312c8-3719-4bea-8034-0958a37b831f\") "
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.769791 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e312c8-3719-4bea-8034-0958a37b831f-logs" (OuterVolumeSpecName: "logs") pod "81e312c8-3719-4bea-8034-0958a37b831f" (UID: "81e312c8-3719-4bea-8034-0958a37b831f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.779984 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e312c8-3719-4bea-8034-0958a37b831f-kube-api-access-p76jc" (OuterVolumeSpecName: "kube-api-access-p76jc") pod "81e312c8-3719-4bea-8034-0958a37b831f" (UID: "81e312c8-3719-4bea-8034-0958a37b831f"). InnerVolumeSpecName "kube-api-access-p76jc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.799260 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-config-data" (OuterVolumeSpecName: "config-data") pod "81e312c8-3719-4bea-8034-0958a37b831f" (UID: "81e312c8-3719-4bea-8034-0958a37b831f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.815187 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81e312c8-3719-4bea-8034-0958a37b831f" (UID: "81e312c8-3719-4bea-8034-0958a37b831f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.861569 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "81e312c8-3719-4bea-8034-0958a37b831f" (UID: "81e312c8-3719-4bea-8034-0958a37b831f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.872475 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.872533 4830 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81e312c8-3719-4bea-8034-0958a37b831f-logs\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.872544 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p76jc\" (UniqueName: \"kubernetes.io/projected/81e312c8-3719-4bea-8034-0958a37b831f-kube-api-access-p76jc\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.872558 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.872568 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/81e312c8-3719-4bea-8034-0958a37b831f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.897091 4830 generic.go:334] "Generic (PLEG): container finished" podID="81e312c8-3719-4bea-8034-0958a37b831f" containerID="055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741" exitCode=0
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.897397 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.898086 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81e312c8-3719-4bea-8034-0958a37b831f","Type":"ContainerDied","Data":"055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741"}
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.898158 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"81e312c8-3719-4bea-8034-0958a37b831f","Type":"ContainerDied","Data":"cefb9660fb012010f3cff2b2be52e164191cee11766b895f06cb602173143e96"}
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.898177 4830 scope.go:117] "RemoveContainer" containerID="055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.929669 4830 scope.go:117] "RemoveContainer" containerID="fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.958510 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.961170 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.966358 4830 scope.go:117] "RemoveContainer" containerID="055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741"
Mar 11 09:37:57 crc kubenswrapper[4830]: E0311 09:37:56.972265 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741\": container with ID starting with 055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741 not found: ID does not exist" containerID="055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.972304 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741"} err="failed to get container status \"055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741\": rpc error: code = NotFound desc = could not find container \"055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741\": container with ID starting with 055f19f163a7dbb912a92335c026e246718670c0ef6c51cf24c7c8138fd88741 not found: ID does not exist"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.972333 4830 scope.go:117] "RemoveContainer" containerID="fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea"
Mar 11 09:37:57 crc kubenswrapper[4830]: E0311 09:37:56.972775 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea\": container with ID starting with fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea not found: ID does not exist" containerID="fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.972804 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea"} err="failed to get container status \"fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea\": rpc error: code = NotFound desc = could not find container \"fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea\": container with ID starting with fe66c89f1bc2d5c154743f75305c5d8ad17e7cce897263fe5f272d30eecbf2ea not found: ID does not exist"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.999280 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 09:37:57 crc kubenswrapper[4830]: E0311 09:37:56.999697 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-log"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.999710 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-log"
Mar 11 09:37:57 crc kubenswrapper[4830]: E0311 09:37:56.999723 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-metadata"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.999729 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-metadata"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.999922 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-metadata"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:56.999943 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e312c8-3719-4bea-8034-0958a37b831f" containerName="nova-metadata-log"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.000886 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.011762 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.014483 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.014514 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.076444 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72993026-e5ee-42ee-9381-36ec25d1d1d0-logs\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.076540 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24g2\" (UniqueName: \"kubernetes.io/projected/72993026-e5ee-42ee-9381-36ec25d1d1d0-kube-api-access-v24g2\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.076586 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.076609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-config-data\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.076794 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.178456 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.178498 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72993026-e5ee-42ee-9381-36ec25d1d1d0-logs\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.178600 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v24g2\" (UniqueName: \"kubernetes.io/projected/72993026-e5ee-42ee-9381-36ec25d1d1d0-kube-api-access-v24g2\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.178637 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0"
Mar 11 09:37:57 crc 
kubenswrapper[4830]: I0311 09:37:57.178660 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-config-data\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0" Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.179122 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72993026-e5ee-42ee-9381-36ec25d1d1d0-logs\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0" Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.183940 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0" Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.184437 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-config-data\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0" Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.184998 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/72993026-e5ee-42ee-9381-36ec25d1d1d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0" Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.193438 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24g2\" (UniqueName: 
\"kubernetes.io/projected/72993026-e5ee-42ee-9381-36ec25d1d1d0-kube-api-access-v24g2\") pod \"nova-metadata-0\" (UID: \"72993026-e5ee-42ee-9381-36ec25d1d1d0\") " pod="openstack/nova-metadata-0" Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.358748 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.964478 4830 generic.go:334] "Generic (PLEG): container finished" podID="173258f5-abf0-4ddb-ba63-02eb03db5521" containerID="8764e41a667c05f357d011d34a1e5c0fc8fd12d4f29e1fd5e3eca5a9ae142271" exitCode=0 Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.964766 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"173258f5-abf0-4ddb-ba63-02eb03db5521","Type":"ContainerDied","Data":"8764e41a667c05f357d011d34a1e5c0fc8fd12d4f29e1fd5e3eca5a9ae142271"} Mar 11 09:37:57 crc kubenswrapper[4830]: I0311 09:37:57.986700 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 11 09:37:58 crc kubenswrapper[4830]: W0311 09:37:58.001668 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72993026_e5ee_42ee_9381_36ec25d1d1d0.slice/crio-605934a45b150c7c7256e20d04d8055ad6caf6c89a047c7b751e532f3e65ea41 WatchSource:0}: Error finding container 605934a45b150c7c7256e20d04d8055ad6caf6c89a047c7b751e532f3e65ea41: Status 404 returned error can't find the container with id 605934a45b150c7c7256e20d04d8055ad6caf6c89a047c7b751e532f3e65ea41 Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.075713 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.134729 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-combined-ca-bundle\") pod \"173258f5-abf0-4ddb-ba63-02eb03db5521\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.134863 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-config-data\") pod \"173258f5-abf0-4ddb-ba63-02eb03db5521\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.134894 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fcsg\" (UniqueName: \"kubernetes.io/projected/173258f5-abf0-4ddb-ba63-02eb03db5521-kube-api-access-6fcsg\") pod \"173258f5-abf0-4ddb-ba63-02eb03db5521\" (UID: \"173258f5-abf0-4ddb-ba63-02eb03db5521\") " Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.143200 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173258f5-abf0-4ddb-ba63-02eb03db5521-kube-api-access-6fcsg" (OuterVolumeSpecName: "kube-api-access-6fcsg") pod "173258f5-abf0-4ddb-ba63-02eb03db5521" (UID: "173258f5-abf0-4ddb-ba63-02eb03db5521"). InnerVolumeSpecName "kube-api-access-6fcsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.192546 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "173258f5-abf0-4ddb-ba63-02eb03db5521" (UID: "173258f5-abf0-4ddb-ba63-02eb03db5521"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.195693 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-config-data" (OuterVolumeSpecName: "config-data") pod "173258f5-abf0-4ddb-ba63-02eb03db5521" (UID: "173258f5-abf0-4ddb-ba63-02eb03db5521"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.237335 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.237363 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fcsg\" (UniqueName: \"kubernetes.io/projected/173258f5-abf0-4ddb-ba63-02eb03db5521-kube-api-access-6fcsg\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.237373 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173258f5-abf0-4ddb-ba63-02eb03db5521-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.942714 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e312c8-3719-4bea-8034-0958a37b831f" path="/var/lib/kubelet/pods/81e312c8-3719-4bea-8034-0958a37b831f/volumes" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.975415 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"173258f5-abf0-4ddb-ba63-02eb03db5521","Type":"ContainerDied","Data":"a4a9ebaeedc96798e9f967c2add25ab9c0afc4483124b5728c063491305234e0"} Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.975553 4830 scope.go:117] "RemoveContainer" 
containerID="8764e41a667c05f357d011d34a1e5c0fc8fd12d4f29e1fd5e3eca5a9ae142271" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.975695 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.978649 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"72993026-e5ee-42ee-9381-36ec25d1d1d0","Type":"ContainerStarted","Data":"7ee6151fbfd129ffcbacba1ddcb836fea64ca1a15acc55fe95232d01ccf26125"} Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.978689 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"72993026-e5ee-42ee-9381-36ec25d1d1d0","Type":"ContainerStarted","Data":"31e9efa0290a44f7613707e26fd5b18bf31f5ab8700aa1a32390015a29287739"} Mar 11 09:37:58 crc kubenswrapper[4830]: I0311 09:37:58.978698 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"72993026-e5ee-42ee-9381-36ec25d1d1d0","Type":"ContainerStarted","Data":"605934a45b150c7c7256e20d04d8055ad6caf6c89a047c7b751e532f3e65ea41"} Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.008114 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.008092619 podStartE2EDuration="3.008092619s" podCreationTimestamp="2026-03-11 09:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:37:58.996771846 +0000 UTC m=+1446.777922555" watchObservedRunningTime="2026-03-11 09:37:59.008092619 +0000 UTC m=+1446.789243308" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.019178 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.027075 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.034921 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:59 crc kubenswrapper[4830]: E0311 09:37:59.035440 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173258f5-abf0-4ddb-ba63-02eb03db5521" containerName="nova-scheduler-scheduler" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.035474 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="173258f5-abf0-4ddb-ba63-02eb03db5521" containerName="nova-scheduler-scheduler" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.035654 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="173258f5-abf0-4ddb-ba63-02eb03db5521" containerName="nova-scheduler-scheduler" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.036267 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.039153 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.054280 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.154058 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-config-data\") pod \"nova-scheduler-0\" (UID: \"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.154102 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.154150 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsgck\" (UniqueName: \"kubernetes.io/projected/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-kube-api-access-zsgck\") pod \"nova-scheduler-0\" (UID: \"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.256434 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-config-data\") pod \"nova-scheduler-0\" (UID: \"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.256491 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.256554 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsgck\" (UniqueName: \"kubernetes.io/projected/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-kube-api-access-zsgck\") pod \"nova-scheduler-0\" (UID: \"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.262674 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-config-data\") pod \"nova-scheduler-0\" (UID: \"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.262906 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.279844 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgck\" (UniqueName: \"kubernetes.io/projected/c53770ee-2b8a-4e7a-a59c-bc09739ce4e5-kube-api-access-zsgck\") pod \"nova-scheduler-0\" (UID: \"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5\") " pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.353958 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.829485 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 11 09:37:59 crc kubenswrapper[4830]: I0311 09:37:59.993895 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5","Type":"ContainerStarted","Data":"8a706de96604d9382fe1a18215572c58cb1b59c4254f07e984df82b70d190ecf"} Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.146841 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553698-9mp47"] Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.148546 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553698-9mp47" Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.152321 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.155570 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.157099 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.161479 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553698-9mp47"] Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.275401 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74c54\" (UniqueName: \"kubernetes.io/projected/8c94f958-2a99-43d5-bee6-f5e2c9ed9e80-kube-api-access-74c54\") pod \"auto-csr-approver-29553698-9mp47\" (UID: \"8c94f958-2a99-43d5-bee6-f5e2c9ed9e80\") " pod="openshift-infra/auto-csr-approver-29553698-9mp47" Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.377732 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74c54\" (UniqueName: \"kubernetes.io/projected/8c94f958-2a99-43d5-bee6-f5e2c9ed9e80-kube-api-access-74c54\") pod \"auto-csr-approver-29553698-9mp47\" (UID: \"8c94f958-2a99-43d5-bee6-f5e2c9ed9e80\") " pod="openshift-infra/auto-csr-approver-29553698-9mp47" Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.396591 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74c54\" (UniqueName: \"kubernetes.io/projected/8c94f958-2a99-43d5-bee6-f5e2c9ed9e80-kube-api-access-74c54\") pod \"auto-csr-approver-29553698-9mp47\" (UID: \"8c94f958-2a99-43d5-bee6-f5e2c9ed9e80\") " 
pod="openshift-infra/auto-csr-approver-29553698-9mp47" Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.469823 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553698-9mp47" Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.920940 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553698-9mp47"] Mar 11 09:38:00 crc kubenswrapper[4830]: I0311 09:38:00.970876 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173258f5-abf0-4ddb-ba63-02eb03db5521" path="/var/lib/kubelet/pods/173258f5-abf0-4ddb-ba63-02eb03db5521/volumes" Mar 11 09:38:01 crc kubenswrapper[4830]: I0311 09:38:01.004005 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553698-9mp47" event={"ID":"8c94f958-2a99-43d5-bee6-f5e2c9ed9e80","Type":"ContainerStarted","Data":"d9f8409141451c46523d4c608ca4183c010f9ac8403760f4eafe6223d2c336ab"} Mar 11 09:38:01 crc kubenswrapper[4830]: I0311 09:38:01.005754 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c53770ee-2b8a-4e7a-a59c-bc09739ce4e5","Type":"ContainerStarted","Data":"efe89d34fbf8c4eb92ea11008daa724e6aa9fd66126983bfcf59a2f4610d36f0"} Mar 11 09:38:01 crc kubenswrapper[4830]: I0311 09:38:01.027966 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.027946639 podStartE2EDuration="2.027946639s" podCreationTimestamp="2026-03-11 09:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:38:01.025356238 +0000 UTC m=+1448.806506937" watchObservedRunningTime="2026-03-11 09:38:01.027946639 +0000 UTC m=+1448.809097338" Mar 11 09:38:02 crc kubenswrapper[4830]: I0311 09:38:02.359679 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 11 09:38:02 crc kubenswrapper[4830]: I0311 09:38:02.360087 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 11 09:38:03 crc kubenswrapper[4830]: I0311 09:38:03.027919 4830 generic.go:334] "Generic (PLEG): container finished" podID="8c94f958-2a99-43d5-bee6-f5e2c9ed9e80" containerID="68135fe04ab55fcdbbc689c20b0ea6fe2184e13647d560a19d3c5234c6b72348" exitCode=0 Mar 11 09:38:03 crc kubenswrapper[4830]: I0311 09:38:03.028202 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553698-9mp47" event={"ID":"8c94f958-2a99-43d5-bee6-f5e2c9ed9e80","Type":"ContainerDied","Data":"68135fe04ab55fcdbbc689c20b0ea6fe2184e13647d560a19d3c5234c6b72348"} Mar 11 09:38:04 crc kubenswrapper[4830]: I0311 09:38:04.229739 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 09:38:04 crc kubenswrapper[4830]: I0311 09:38:04.231086 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 11 09:38:04 crc kubenswrapper[4830]: I0311 09:38:04.354095 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 11 09:38:04 crc kubenswrapper[4830]: I0311 09:38:04.508565 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553698-9mp47" Mar 11 09:38:04 crc kubenswrapper[4830]: I0311 09:38:04.557045 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74c54\" (UniqueName: \"kubernetes.io/projected/8c94f958-2a99-43d5-bee6-f5e2c9ed9e80-kube-api-access-74c54\") pod \"8c94f958-2a99-43d5-bee6-f5e2c9ed9e80\" (UID: \"8c94f958-2a99-43d5-bee6-f5e2c9ed9e80\") " Mar 11 09:38:04 crc kubenswrapper[4830]: I0311 09:38:04.573010 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c94f958-2a99-43d5-bee6-f5e2c9ed9e80-kube-api-access-74c54" (OuterVolumeSpecName: "kube-api-access-74c54") pod "8c94f958-2a99-43d5-bee6-f5e2c9ed9e80" (UID: "8c94f958-2a99-43d5-bee6-f5e2c9ed9e80"). InnerVolumeSpecName "kube-api-access-74c54". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:38:04 crc kubenswrapper[4830]: I0311 09:38:04.663463 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74c54\" (UniqueName: \"kubernetes.io/projected/8c94f958-2a99-43d5-bee6-f5e2c9ed9e80-kube-api-access-74c54\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:05 crc kubenswrapper[4830]: I0311 09:38:05.047638 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553698-9mp47" event={"ID":"8c94f958-2a99-43d5-bee6-f5e2c9ed9e80","Type":"ContainerDied","Data":"d9f8409141451c46523d4c608ca4183c010f9ac8403760f4eafe6223d2c336ab"} Mar 11 09:38:05 crc kubenswrapper[4830]: I0311 09:38:05.047683 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553698-9mp47" Mar 11 09:38:05 crc kubenswrapper[4830]: I0311 09:38:05.048065 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9f8409141451c46523d4c608ca4183c010f9ac8403760f4eafe6223d2c336ab" Mar 11 09:38:05 crc kubenswrapper[4830]: I0311 09:38:05.241166 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3eb0127-a012-4cbf-8768-84e20518f316" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 09:38:05 crc kubenswrapper[4830]: I0311 09:38:05.241158 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3eb0127-a012-4cbf-8768-84e20518f316" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 11 09:38:05 crc kubenswrapper[4830]: I0311 09:38:05.583054 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553692-7d4kj"] Mar 11 09:38:05 crc kubenswrapper[4830]: I0311 09:38:05.592643 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553692-7d4kj"] Mar 11 09:38:06 crc kubenswrapper[4830]: I0311 09:38:06.951306 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcf580f9-330c-4fc3-85af-5a92d87a6d79" path="/var/lib/kubelet/pods/dcf580f9-330c-4fc3-85af-5a92d87a6d79/volumes" Mar 11 09:38:07 crc kubenswrapper[4830]: I0311 09:38:07.359601 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 09:38:07 crc kubenswrapper[4830]: I0311 09:38:07.359671 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 11 09:38:07 crc 
kubenswrapper[4830]: I0311 09:38:07.484282 4830 scope.go:117] "RemoveContainer" containerID="ebdfdf4bf38a9b39b6cf4119f1a1294f6fdbf9ad56e0094b52538ccc2ce2c9b7"
Mar 11 09:38:08 crc kubenswrapper[4830]: I0311 09:38:08.372226 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="72993026-e5ee-42ee-9381-36ec25d1d1d0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 11 09:38:08 crc kubenswrapper[4830]: I0311 09:38:08.372323 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="72993026-e5ee-42ee-9381-36ec25d1d1d0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 11 09:38:09 crc kubenswrapper[4830]: I0311 09:38:09.355164 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 11 09:38:09 crc kubenswrapper[4830]: I0311 09:38:09.384439 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 11 09:38:10 crc kubenswrapper[4830]: I0311 09:38:10.129635 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 11 09:38:12 crc kubenswrapper[4830]: I0311 09:38:12.138819 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 11 09:38:14 crc kubenswrapper[4830]: I0311 09:38:14.242312 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 11 09:38:14 crc kubenswrapper[4830]: I0311 09:38:14.242975 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 11 09:38:14 crc kubenswrapper[4830]: I0311 09:38:14.245251 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 11 09:38:14 crc kubenswrapper[4830]: I0311 09:38:14.247890 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 11 09:38:15 crc kubenswrapper[4830]: I0311 09:38:15.135993 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 11 09:38:15 crc kubenswrapper[4830]: I0311 09:38:15.142252 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 11 09:38:17 crc kubenswrapper[4830]: I0311 09:38:17.415828 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 11 09:38:17 crc kubenswrapper[4830]: I0311 09:38:17.417818 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 11 09:38:17 crc kubenswrapper[4830]: I0311 09:38:17.425672 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 11 09:38:18 crc kubenswrapper[4830]: I0311 09:38:18.167614 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 11 09:38:26 crc kubenswrapper[4830]: I0311 09:38:26.242205 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 09:38:27 crc kubenswrapper[4830]: I0311 09:38:27.264281 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 11 09:38:30 crc kubenswrapper[4830]: I0311 09:38:30.673684 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerName="rabbitmq" containerID="cri-o://3220b80cf459f53356fc59464b56078b0b7074afee6c4ffef2dc9092091aa52c" gracePeriod=604796
Mar 11 09:38:31 crc kubenswrapper[4830]: I0311 09:38:31.844644 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerName="rabbitmq" containerID="cri-o://6f1470d751e93a88f827c3665ad0c284f3ff260cab385acdd59b329e7a850524" gracePeriod=604796
Mar 11 09:38:36 crc kubenswrapper[4830]: I0311 09:38:36.876241 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.249826 4830 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.347179 4830 generic.go:334] "Generic (PLEG): container finished" podID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerID="3220b80cf459f53356fc59464b56078b0b7074afee6c4ffef2dc9092091aa52c" exitCode=0
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.347256 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75e77e40-6cb5-47ec-9074-b663b7dba6b4","Type":"ContainerDied","Data":"3220b80cf459f53356fc59464b56078b0b7074afee6c4ffef2dc9092091aa52c"}
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.582657 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.699662 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75e77e40-6cb5-47ec-9074-b663b7dba6b4-pod-info\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.699758 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-erlang-cookie\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.699804 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-plugins\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.699931 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-confd\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.699960 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-plugins-conf\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.700011 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48bw9\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-kube-api-access-48bw9\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.700085 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-tls\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.700132 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-server-conf\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.700194 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-config-data\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.700227 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.700304 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75e77e40-6cb5-47ec-9074-b663b7dba6b4-erlang-cookie-secret\") pod \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\" (UID: \"75e77e40-6cb5-47ec-9074-b663b7dba6b4\") "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.702328 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.703423 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.703785 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.706150 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e77e40-6cb5-47ec-9074-b663b7dba6b4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.708217 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/75e77e40-6cb5-47ec-9074-b663b7dba6b4-pod-info" (OuterVolumeSpecName: "pod-info") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.708241 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-kube-api-access-48bw9" (OuterVolumeSpecName: "kube-api-access-48bw9") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "kube-api-access-48bw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.710859 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.712349 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.741404 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-config-data" (OuterVolumeSpecName: "config-data") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.765013 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-server-conf" (OuterVolumeSpecName: "server-conf") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803476 4830 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75e77e40-6cb5-47ec-9074-b663b7dba6b4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803512 4830 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75e77e40-6cb5-47ec-9074-b663b7dba6b4-pod-info\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803525 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803535 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803544 4830 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803553 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48bw9\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-kube-api-access-48bw9\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803560 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803567 4830 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-server-conf\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803575 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75e77e40-6cb5-47ec-9074-b663b7dba6b4-config-data\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.803607 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.831303 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "75e77e40-6cb5-47ec-9074-b663b7dba6b4" (UID: "75e77e40-6cb5-47ec-9074-b663b7dba6b4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.833615 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.904997 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75e77e40-6cb5-47ec-9074-b663b7dba6b4-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:37 crc kubenswrapper[4830]: I0311 09:38:37.905982 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.365657 4830 generic.go:334] "Generic (PLEG): container finished" podID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerID="6f1470d751e93a88f827c3665ad0c284f3ff260cab385acdd59b329e7a850524" exitCode=0
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.365903 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3","Type":"ContainerDied","Data":"6f1470d751e93a88f827c3665ad0c284f3ff260cab385acdd59b329e7a850524"}
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.367589 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"75e77e40-6cb5-47ec-9074-b663b7dba6b4","Type":"ContainerDied","Data":"5e5abd159913c6427c0f7c31e7151bf5e7593ea7c11a507f166f7d4ddbc2320a"}
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.367630 4830 scope.go:117] "RemoveContainer" containerID="3220b80cf459f53356fc59464b56078b0b7074afee6c4ffef2dc9092091aa52c"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.367751 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.425116 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.446296 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.453227 4830 scope.go:117] "RemoveContainer" containerID="14d9a01991262c020d733b7284b2b073581e01cc8c4c54b5086cc048dede37f5"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.463587 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 09:38:38 crc kubenswrapper[4830]: E0311 09:38:38.464021 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c94f958-2a99-43d5-bee6-f5e2c9ed9e80" containerName="oc"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.464057 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c94f958-2a99-43d5-bee6-f5e2c9ed9e80" containerName="oc"
Mar 11 09:38:38 crc kubenswrapper[4830]: E0311 09:38:38.464078 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerName="setup-container"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.464085 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerName="setup-container"
Mar 11 09:38:38 crc kubenswrapper[4830]: E0311 09:38:38.464108 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerName="rabbitmq"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.464114 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerName="rabbitmq"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.464292 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c94f958-2a99-43d5-bee6-f5e2c9ed9e80" containerName="oc"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.464317 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" containerName="rabbitmq"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.470002 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.473844 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.474244 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.474342 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mrvfm"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.476439 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.476531 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.476611 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.476763 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.481760 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624135 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624209 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624410 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624460 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624540 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0f47113-88e8-4b57-b9df-1ff8b05cde01-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624619 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624709 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624765 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624803 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624882 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0f47113-88e8-4b57-b9df-1ff8b05cde01-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.624972 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhgh\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-kube-api-access-hnhgh\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.724502 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726490 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726533 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726557 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726573 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726622 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0f47113-88e8-4b57-b9df-1ff8b05cde01-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726654 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhgh\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-kube-api-access-hnhgh\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726691 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726726 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726776 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726792 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726821 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0f47113-88e8-4b57-b9df-1ff8b05cde01-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.727059 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.726781 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.727308 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.727649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-config-data\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.727862 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.728980 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e0f47113-88e8-4b57-b9df-1ff8b05cde01-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.732179 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e0f47113-88e8-4b57-b9df-1ff8b05cde01-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.733098 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.734560 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.745625 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e0f47113-88e8-4b57-b9df-1ff8b05cde01-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.755192 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhgh\" (UniqueName: \"kubernetes.io/projected/e0f47113-88e8-4b57-b9df-1ff8b05cde01-kube-api-access-hnhgh\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.776666 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"e0f47113-88e8-4b57-b9df-1ff8b05cde01\") " pod="openstack/rabbitmq-server-0"
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828601 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-erlang-cookie-secret\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828653 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbmbg\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-kube-api-access-jbmbg\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828717 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-confd\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828749 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-erlang-cookie\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828786 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-server-conf\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828814 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-pod-info\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828858 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-plugins\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828905 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-tls\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828943 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-config-data\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.828980 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-plugins-conf\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.829002 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\" (UID: \"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3\") "
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.829355 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.829858 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.830836 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "rabbitmq-plugins".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.831286 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.833251 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-kube-api-access-jbmbg" (OuterVolumeSpecName: "kube-api-access-jbmbg") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "kube-api-access-jbmbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.833528 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.835193 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-pod-info" (OuterVolumeSpecName: "pod-info") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.837560 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.841348 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.858097 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.891539 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-config-data" (OuterVolumeSpecName: "config-data") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.919095 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-server-conf" (OuterVolumeSpecName: "server-conf") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933228 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933276 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933288 4830 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933316 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933330 4830 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933340 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbmbg\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-kube-api-access-jbmbg\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933350 4830 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-server-conf\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933358 4830 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-pod-info\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.933369 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.954374 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e77e40-6cb5-47ec-9074-b663b7dba6b4" path="/var/lib/kubelet/pods/75e77e40-6cb5-47ec-9074-b663b7dba6b4/volumes" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.969844 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 11 09:38:38 crc kubenswrapper[4830]: I0311 09:38:38.979306 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" (UID: "bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.036488 4830 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.036526 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.390304 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.390640 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3","Type":"ContainerDied","Data":"ec34ddf85b926f765458072737642be5ae9e7474b8b74e5cd15ba11843142d80"} Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.390662 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.390683 4830 scope.go:117] "RemoveContainer" containerID="6f1470d751e93a88f827c3665ad0c284f3ff260cab385acdd59b329e7a850524" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.530300 4830 scope.go:117] "RemoveContainer" containerID="d0f891f9ee5111ec4ec41fb1f690e427848198d851ed49b68aea9751db762add" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.550698 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.560406 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.579748 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 09:38:39 crc kubenswrapper[4830]: E0311 09:38:39.580368 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerName="rabbitmq" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.580469 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerName="rabbitmq" Mar 11 09:38:39 crc kubenswrapper[4830]: E0311 09:38:39.580529 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerName="setup-container" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.580595 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerName="setup-container" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.580831 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" containerName="rabbitmq" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.582461 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.584601 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.584622 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.584988 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.585046 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.586120 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8pr4h" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.586354 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.587174 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.601808 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750141 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750221 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750257 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5da20462-be2b-466c-9c04-17b6a0a94572-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750292 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frfmr\" (UniqueName: \"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-kube-api-access-frfmr\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750317 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5da20462-be2b-466c-9c04-17b6a0a94572-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750343 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750373 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750694 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750762 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.750791 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.852986 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853070 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853093 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5da20462-be2b-466c-9c04-17b6a0a94572-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853120 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frfmr\" (UniqueName: \"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-kube-api-access-frfmr\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853141 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5da20462-be2b-466c-9c04-17b6a0a94572-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853169 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853186 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853234 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853261 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853287 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.853304 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.854114 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.854394 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.854594 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.854757 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.854938 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5da20462-be2b-466c-9c04-17b6a0a94572-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.855043 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.859112 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5da20462-be2b-466c-9c04-17b6a0a94572-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.859119 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.859471 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.868821 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5da20462-be2b-466c-9c04-17b6a0a94572-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.869649 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frfmr\" (UniqueName: 
\"kubernetes.io/projected/5da20462-be2b-466c-9c04-17b6a0a94572-kube-api-access-frfmr\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.882752 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5da20462-be2b-466c-9c04-17b6a0a94572\") " pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:39 crc kubenswrapper[4830]: I0311 09:38:39.941830 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:38:40 crc kubenswrapper[4830]: I0311 09:38:40.203134 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 11 09:38:40 crc kubenswrapper[4830]: I0311 09:38:40.402084 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5da20462-be2b-466c-9c04-17b6a0a94572","Type":"ContainerStarted","Data":"c4c36dc0bcfd28d1f7ded417bbe58a4f6c4057591a5b8dc2a28cb39487e11c3f"} Mar 11 09:38:40 crc kubenswrapper[4830]: I0311 09:38:40.403403 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0f47113-88e8-4b57-b9df-1ff8b05cde01","Type":"ContainerStarted","Data":"c2de820c6c2be04e45c6aae91bb47a49888a2d9ea8d42b36af3f1a6b46787953"} Mar 11 09:38:40 crc kubenswrapper[4830]: I0311 09:38:40.943308 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3" path="/var/lib/kubelet/pods/bf04bb60-c9f6-42a1-bd5b-1ae32d3de4c3/volumes" Mar 11 09:38:41 crc kubenswrapper[4830]: I0311 09:38:41.432101 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"e0f47113-88e8-4b57-b9df-1ff8b05cde01","Type":"ContainerStarted","Data":"8d9dd3545de9909bd8d60af5b82feb8f8a6636f52a08f1950de7be43c3790656"} Mar 11 09:38:42 crc kubenswrapper[4830]: I0311 09:38:42.442705 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5da20462-be2b-466c-9c04-17b6a0a94572","Type":"ContainerStarted","Data":"f7daaa01d6478d78935125c92c75c45a4da3b4a7f49541065451c0452bfd5dfb"} Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.426516 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-2p4lr"] Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.428287 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.431482 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.453392 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-2p4lr"] Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.550399 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-svc\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.550767 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 
09:38:44.550905 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-config\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.550942 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.551230 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.551273 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.551326 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7nz\" (UniqueName: \"kubernetes.io/projected/1d66fb79-aef9-4448-a9cd-310e12aded4e-kube-api-access-bf7nz\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " 
pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.652462 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-config\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.652551 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.652620 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.652639 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.652664 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7nz\" (UniqueName: \"kubernetes.io/projected/1d66fb79-aef9-4448-a9cd-310e12aded4e-kube-api-access-bf7nz\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " 
pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.653674 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-openstack-edpm-ipam\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.653726 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-svc\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.653809 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-config\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.653853 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-swift-storage-0\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.653868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-sb\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 
09:38:44.654292 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-svc\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.654399 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.654993 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-nb\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.696695 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7nz\" (UniqueName: \"kubernetes.io/projected/1d66fb79-aef9-4448-a9cd-310e12aded4e-kube-api-access-bf7nz\") pod \"dnsmasq-dns-5576978c7c-2p4lr\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:44 crc kubenswrapper[4830]: I0311 09:38:44.767676 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:45 crc kubenswrapper[4830]: I0311 09:38:45.269279 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-2p4lr"] Mar 11 09:38:45 crc kubenswrapper[4830]: I0311 09:38:45.471588 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" event={"ID":"1d66fb79-aef9-4448-a9cd-310e12aded4e","Type":"ContainerStarted","Data":"97d484b0a0cf7d21a74b5739488b17f2b0ec8c00eca8c420aa6e678030473016"} Mar 11 09:38:46 crc kubenswrapper[4830]: I0311 09:38:46.481960 4830 generic.go:334] "Generic (PLEG): container finished" podID="1d66fb79-aef9-4448-a9cd-310e12aded4e" containerID="a73b0e91907c23e1fa66a154f1dfb8f77ce75e475a46277361e2b95b9bdcdc73" exitCode=0 Mar 11 09:38:46 crc kubenswrapper[4830]: I0311 09:38:46.482059 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" event={"ID":"1d66fb79-aef9-4448-a9cd-310e12aded4e","Type":"ContainerDied","Data":"a73b0e91907c23e1fa66a154f1dfb8f77ce75e475a46277361e2b95b9bdcdc73"} Mar 11 09:38:47 crc kubenswrapper[4830]: I0311 09:38:47.492151 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" event={"ID":"1d66fb79-aef9-4448-a9cd-310e12aded4e","Type":"ContainerStarted","Data":"a4e403c50222ca60982f7753d83a82fb6d99bdf1855e695e8e3698d422878e77"} Mar 11 09:38:47 crc kubenswrapper[4830]: I0311 09:38:47.492604 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:54 crc kubenswrapper[4830]: I0311 09:38:54.769375 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:38:54 crc kubenswrapper[4830]: I0311 09:38:54.805625 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" 
podStartSLOduration=10.805606269 podStartE2EDuration="10.805606269s" podCreationTimestamp="2026-03-11 09:38:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:38:47.516494102 +0000 UTC m=+1495.297644811" watchObservedRunningTime="2026-03-11 09:38:54.805606269 +0000 UTC m=+1502.586756958" Mar 11 09:38:54 crc kubenswrapper[4830]: I0311 09:38:54.864092 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-dfj4r"] Mar 11 09:38:54 crc kubenswrapper[4830]: I0311 09:38:54.864744 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" podUID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" containerName="dnsmasq-dns" containerID="cri-o://83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d" gracePeriod=10 Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.177179 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-cms8z"] Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.179159 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.188417 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-cms8z"] Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.302842 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.303013 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.303066 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.303131 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.303195 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-config\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.303303 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.303365 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkdmh\" (UniqueName: \"kubernetes.io/projected/e6b1a549-c16a-4efe-83df-800de8dbdac2-kube-api-access-pkdmh\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.404877 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkdmh\" (UniqueName: \"kubernetes.io/projected/e6b1a549-c16a-4efe-83df-800de8dbdac2-kube-api-access-pkdmh\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.404933 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-dns-svc\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.404981 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.405010 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.405057 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.405109 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-config\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.405174 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.406119 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-ovsdbserver-nb\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.406142 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-openstack-edpm-ipam\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.406409 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-dns-swift-storage-0\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.406458 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-config\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.406722 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-ovsdbserver-sb\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.409601 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b1a549-c16a-4efe-83df-800de8dbdac2-dns-svc\") pod 
\"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.429951 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkdmh\" (UniqueName: \"kubernetes.io/projected/e6b1a549-c16a-4efe-83df-800de8dbdac2-kube-api-access-pkdmh\") pod \"dnsmasq-dns-8c6f6df99-cms8z\" (UID: \"e6b1a549-c16a-4efe-83df-800de8dbdac2\") " pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.515004 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.543903 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.597777 4830 generic.go:334] "Generic (PLEG): container finished" podID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" containerID="83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d" exitCode=0 Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.598220 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" event={"ID":"6edd0f6a-66c6-491e-9db5-0c9f617709d5","Type":"ContainerDied","Data":"83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d"} Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.598254 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" event={"ID":"6edd0f6a-66c6-491e-9db5-0c9f617709d5","Type":"ContainerDied","Data":"f0da355977a72a9054aed548ebff2bc856c3c41a10574120d37f0748ae2c4282"} Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.598278 4830 scope.go:117] "RemoveContainer" containerID="83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 
09:38:55.598482 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6c5df9-dfj4r" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.673327 4830 scope.go:117] "RemoveContainer" containerID="0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.710623 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-svc\") pod \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.710799 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-nb\") pod \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.710877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-swift-storage-0\") pod \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.710922 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-sb\") pod \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.710972 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crkm5\" (UniqueName: 
\"kubernetes.io/projected/6edd0f6a-66c6-491e-9db5-0c9f617709d5-kube-api-access-crkm5\") pod \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.711056 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-config\") pod \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\" (UID: \"6edd0f6a-66c6-491e-9db5-0c9f617709d5\") " Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.718473 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edd0f6a-66c6-491e-9db5-0c9f617709d5-kube-api-access-crkm5" (OuterVolumeSpecName: "kube-api-access-crkm5") pod "6edd0f6a-66c6-491e-9db5-0c9f617709d5" (UID: "6edd0f6a-66c6-491e-9db5-0c9f617709d5"). InnerVolumeSpecName "kube-api-access-crkm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.721037 4830 scope.go:117] "RemoveContainer" containerID="83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d" Mar 11 09:38:55 crc kubenswrapper[4830]: E0311 09:38:55.724178 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d\": container with ID starting with 83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d not found: ID does not exist" containerID="83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.724219 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d"} err="failed to get container status \"83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d\": rpc error: code = NotFound desc = 
could not find container \"83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d\": container with ID starting with 83dcca1fed5cff3f5e304ef84aff26c0930662059ff60e184a87d8b85b257c1d not found: ID does not exist" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.724247 4830 scope.go:117] "RemoveContainer" containerID="0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b" Mar 11 09:38:55 crc kubenswrapper[4830]: E0311 09:38:55.728139 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b\": container with ID starting with 0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b not found: ID does not exist" containerID="0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.728179 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b"} err="failed to get container status \"0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b\": rpc error: code = NotFound desc = could not find container \"0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b\": container with ID starting with 0eb98dc3c7bc3b653d8a53ef29e931fe1027321757711ff2d096426f25e5cc7b not found: ID does not exist" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.782998 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6edd0f6a-66c6-491e-9db5-0c9f617709d5" (UID: "6edd0f6a-66c6-491e-9db5-0c9f617709d5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.784006 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6edd0f6a-66c6-491e-9db5-0c9f617709d5" (UID: "6edd0f6a-66c6-491e-9db5-0c9f617709d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.789213 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6edd0f6a-66c6-491e-9db5-0c9f617709d5" (UID: "6edd0f6a-66c6-491e-9db5-0c9f617709d5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.796829 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6edd0f6a-66c6-491e-9db5-0c9f617709d5" (UID: "6edd0f6a-66c6-491e-9db5-0c9f617709d5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.816013 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.816096 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.816108 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.816120 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crkm5\" (UniqueName: \"kubernetes.io/projected/6edd0f6a-66c6-491e-9db5-0c9f617709d5-kube-api-access-crkm5\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.816135 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.851948 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-config" (OuterVolumeSpecName: "config") pod "6edd0f6a-66c6-491e-9db5-0c9f617709d5" (UID: "6edd0f6a-66c6-491e-9db5-0c9f617709d5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.918388 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edd0f6a-66c6-491e-9db5-0c9f617709d5-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.940987 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-dfj4r"] Mar 11 09:38:55 crc kubenswrapper[4830]: I0311 09:38:55.949738 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6c5df9-dfj4r"] Mar 11 09:38:56 crc kubenswrapper[4830]: I0311 09:38:56.028327 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c6f6df99-cms8z"] Mar 11 09:38:56 crc kubenswrapper[4830]: W0311 09:38:56.031803 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b1a549_c16a_4efe_83df_800de8dbdac2.slice/crio-911708fbca69b8e73d94899c5f35829c7c3eed10406828a8180035a100b03630 WatchSource:0}: Error finding container 911708fbca69b8e73d94899c5f35829c7c3eed10406828a8180035a100b03630: Status 404 returned error can't find the container with id 911708fbca69b8e73d94899c5f35829c7c3eed10406828a8180035a100b03630 Mar 11 09:38:56 crc kubenswrapper[4830]: I0311 09:38:56.608072 4830 generic.go:334] "Generic (PLEG): container finished" podID="e6b1a549-c16a-4efe-83df-800de8dbdac2" containerID="386473d0427c72d2d2f997fef7afd6358b9c0d5f45a1a3ea306f2a149312c173" exitCode=0 Mar 11 09:38:56 crc kubenswrapper[4830]: I0311 09:38:56.608187 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" event={"ID":"e6b1a549-c16a-4efe-83df-800de8dbdac2","Type":"ContainerDied","Data":"386473d0427c72d2d2f997fef7afd6358b9c0d5f45a1a3ea306f2a149312c173"} Mar 11 09:38:56 crc kubenswrapper[4830]: I0311 09:38:56.608434 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" event={"ID":"e6b1a549-c16a-4efe-83df-800de8dbdac2","Type":"ContainerStarted","Data":"911708fbca69b8e73d94899c5f35829c7c3eed10406828a8180035a100b03630"} Mar 11 09:38:56 crc kubenswrapper[4830]: I0311 09:38:56.943946 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" path="/var/lib/kubelet/pods/6edd0f6a-66c6-491e-9db5-0c9f617709d5/volumes" Mar 11 09:38:57 crc kubenswrapper[4830]: I0311 09:38:57.623481 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" event={"ID":"e6b1a549-c16a-4efe-83df-800de8dbdac2","Type":"ContainerStarted","Data":"860750ecd91ae37b3a551e7974545aa6ed40feceb42d47e4a6ea7783380fb5f7"} Mar 11 09:38:57 crc kubenswrapper[4830]: I0311 09:38:57.623990 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:38:57 crc kubenswrapper[4830]: I0311 09:38:57.655268 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" podStartSLOduration=2.655236392 podStartE2EDuration="2.655236392s" podCreationTimestamp="2026-03-11 09:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:38:57.641228497 +0000 UTC m=+1505.422379196" watchObservedRunningTime="2026-03-11 09:38:57.655236392 +0000 UTC m=+1505.436387111" Mar 11 09:39:05 crc kubenswrapper[4830]: I0311 09:39:05.517265 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8c6f6df99-cms8z" Mar 11 09:39:05 crc kubenswrapper[4830]: I0311 09:39:05.588903 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-2p4lr"] Mar 11 09:39:05 crc kubenswrapper[4830]: I0311 09:39:05.589218 4830 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" podUID="1d66fb79-aef9-4448-a9cd-310e12aded4e" containerName="dnsmasq-dns" containerID="cri-o://a4e403c50222ca60982f7753d83a82fb6d99bdf1855e695e8e3698d422878e77" gracePeriod=10 Mar 11 09:39:05 crc kubenswrapper[4830]: I0311 09:39:05.711945 4830 generic.go:334] "Generic (PLEG): container finished" podID="1d66fb79-aef9-4448-a9cd-310e12aded4e" containerID="a4e403c50222ca60982f7753d83a82fb6d99bdf1855e695e8e3698d422878e77" exitCode=0 Mar 11 09:39:05 crc kubenswrapper[4830]: I0311 09:39:05.712284 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" event={"ID":"1d66fb79-aef9-4448-a9cd-310e12aded4e","Type":"ContainerDied","Data":"a4e403c50222ca60982f7753d83a82fb6d99bdf1855e695e8e3698d422878e77"} Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.087787 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.136460 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf7nz\" (UniqueName: \"kubernetes.io/projected/1d66fb79-aef9-4448-a9cd-310e12aded4e-kube-api-access-bf7nz\") pod \"1d66fb79-aef9-4448-a9cd-310e12aded4e\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.136513 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-sb\") pod \"1d66fb79-aef9-4448-a9cd-310e12aded4e\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.136698 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-openstack-edpm-ipam\") pod 
\"1d66fb79-aef9-4448-a9cd-310e12aded4e\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.136730 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-nb\") pod \"1d66fb79-aef9-4448-a9cd-310e12aded4e\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.136764 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-swift-storage-0\") pod \"1d66fb79-aef9-4448-a9cd-310e12aded4e\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.136795 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-svc\") pod \"1d66fb79-aef9-4448-a9cd-310e12aded4e\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.136814 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-config\") pod \"1d66fb79-aef9-4448-a9cd-310e12aded4e\" (UID: \"1d66fb79-aef9-4448-a9cd-310e12aded4e\") " Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.152657 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d66fb79-aef9-4448-a9cd-310e12aded4e-kube-api-access-bf7nz" (OuterVolumeSpecName: "kube-api-access-bf7nz") pod "1d66fb79-aef9-4448-a9cd-310e12aded4e" (UID: "1d66fb79-aef9-4448-a9cd-310e12aded4e"). InnerVolumeSpecName "kube-api-access-bf7nz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.199196 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "1d66fb79-aef9-4448-a9cd-310e12aded4e" (UID: "1d66fb79-aef9-4448-a9cd-310e12aded4e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.205777 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d66fb79-aef9-4448-a9cd-310e12aded4e" (UID: "1d66fb79-aef9-4448-a9cd-310e12aded4e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.208829 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d66fb79-aef9-4448-a9cd-310e12aded4e" (UID: "1d66fb79-aef9-4448-a9cd-310e12aded4e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.217393 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1d66fb79-aef9-4448-a9cd-310e12aded4e" (UID: "1d66fb79-aef9-4448-a9cd-310e12aded4e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.224103 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1d66fb79-aef9-4448-a9cd-310e12aded4e" (UID: "1d66fb79-aef9-4448-a9cd-310e12aded4e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.228053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-config" (OuterVolumeSpecName: "config") pod "1d66fb79-aef9-4448-a9cd-310e12aded4e" (UID: "1d66fb79-aef9-4448-a9cd-310e12aded4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.237985 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.238036 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.238047 4830 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.238057 4830 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-dns-svc\") on node \"crc\" 
DevicePath \"\"" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.238067 4830 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-config\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.238075 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf7nz\" (UniqueName: \"kubernetes.io/projected/1d66fb79-aef9-4448-a9cd-310e12aded4e-kube-api-access-bf7nz\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.238085 4830 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d66fb79-aef9-4448-a9cd-310e12aded4e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.721792 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" event={"ID":"1d66fb79-aef9-4448-a9cd-310e12aded4e","Type":"ContainerDied","Data":"97d484b0a0cf7d21a74b5739488b17f2b0ec8c00eca8c420aa6e678030473016"} Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.721844 4830 scope.go:117] "RemoveContainer" containerID="a4e403c50222ca60982f7753d83a82fb6d99bdf1855e695e8e3698d422878e77" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.721975 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5576978c7c-2p4lr" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.761996 4830 scope.go:117] "RemoveContainer" containerID="a73b0e91907c23e1fa66a154f1dfb8f77ce75e475a46277361e2b95b9bdcdc73" Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.763594 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-2p4lr"] Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.771050 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5576978c7c-2p4lr"] Mar 11 09:39:06 crc kubenswrapper[4830]: I0311 09:39:06.948420 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d66fb79-aef9-4448-a9cd-310e12aded4e" path="/var/lib/kubelet/pods/1d66fb79-aef9-4448-a9cd-310e12aded4e/volumes" Mar 11 09:39:13 crc kubenswrapper[4830]: I0311 09:39:13.062033 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:39:13 crc kubenswrapper[4830]: I0311 09:39:13.062584 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:39:13 crc kubenswrapper[4830]: I0311 09:39:13.784832 4830 generic.go:334] "Generic (PLEG): container finished" podID="5da20462-be2b-466c-9c04-17b6a0a94572" containerID="f7daaa01d6478d78935125c92c75c45a4da3b4a7f49541065451c0452bfd5dfb" exitCode=0 Mar 11 09:39:13 crc kubenswrapper[4830]: I0311 09:39:13.784905 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"5da20462-be2b-466c-9c04-17b6a0a94572","Type":"ContainerDied","Data":"f7daaa01d6478d78935125c92c75c45a4da3b4a7f49541065451c0452bfd5dfb"} Mar 11 09:39:13 crc kubenswrapper[4830]: I0311 09:39:13.791605 4830 generic.go:334] "Generic (PLEG): container finished" podID="e0f47113-88e8-4b57-b9df-1ff8b05cde01" containerID="8d9dd3545de9909bd8d60af5b82feb8f8a6636f52a08f1950de7be43c3790656" exitCode=0 Mar 11 09:39:13 crc kubenswrapper[4830]: I0311 09:39:13.791651 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0f47113-88e8-4b57-b9df-1ff8b05cde01","Type":"ContainerDied","Data":"8d9dd3545de9909bd8d60af5b82feb8f8a6636f52a08f1950de7be43c3790656"} Mar 11 09:39:13 crc kubenswrapper[4830]: E0311 09:39:13.852208 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da20462_be2b_466c_9c04_17b6a0a94572.slice/crio-f7daaa01d6478d78935125c92c75c45a4da3b4a7f49541065451c0452bfd5dfb.scope\": RecentStats: unable to find data in memory cache]" Mar 11 09:39:14 crc kubenswrapper[4830]: I0311 09:39:14.800837 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5da20462-be2b-466c-9c04-17b6a0a94572","Type":"ContainerStarted","Data":"39780ea70b4d95914eb521a7e65063d5a91ca79c4dde8346a6c2610dfaea2e68"} Mar 11 09:39:14 crc kubenswrapper[4830]: I0311 09:39:14.802304 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:39:14 crc kubenswrapper[4830]: I0311 09:39:14.805612 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e0f47113-88e8-4b57-b9df-1ff8b05cde01","Type":"ContainerStarted","Data":"156542a0c8dffd2f5eebcb5f46fa480c0c3421e648ed881ce3f451b78498dff9"} Mar 11 09:39:14 crc kubenswrapper[4830]: I0311 09:39:14.807060 4830 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 11 09:39:14 crc kubenswrapper[4830]: I0311 09:39:14.861670 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.861650792 podStartE2EDuration="36.861650792s" podCreationTimestamp="2026-03-11 09:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:39:14.857882908 +0000 UTC m=+1522.639033607" watchObservedRunningTime="2026-03-11 09:39:14.861650792 +0000 UTC m=+1522.642801481" Mar 11 09:39:14 crc kubenswrapper[4830]: I0311 09:39:14.865396 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.865388274 podStartE2EDuration="35.865388274s" podCreationTimestamp="2026-03-11 09:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 09:39:14.837533929 +0000 UTC m=+1522.618684618" watchObservedRunningTime="2026-03-11 09:39:14.865388274 +0000 UTC m=+1522.646538963" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.937078 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs"] Mar 11 09:39:17 crc kubenswrapper[4830]: E0311 09:39:17.937788 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d66fb79-aef9-4448-a9cd-310e12aded4e" containerName="init" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.937804 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d66fb79-aef9-4448-a9cd-310e12aded4e" containerName="init" Mar 11 09:39:17 crc kubenswrapper[4830]: E0311 09:39:17.937837 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" containerName="init" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 
09:39:17.937844 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" containerName="init" Mar 11 09:39:17 crc kubenswrapper[4830]: E0311 09:39:17.937867 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" containerName="dnsmasq-dns" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.937873 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" containerName="dnsmasq-dns" Mar 11 09:39:17 crc kubenswrapper[4830]: E0311 09:39:17.937886 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d66fb79-aef9-4448-a9cd-310e12aded4e" containerName="dnsmasq-dns" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.937891 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d66fb79-aef9-4448-a9cd-310e12aded4e" containerName="dnsmasq-dns" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.938254 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d66fb79-aef9-4448-a9cd-310e12aded4e" containerName="dnsmasq-dns" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.938285 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6edd0f6a-66c6-491e-9db5-0c9f617709d5" containerName="dnsmasq-dns" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.938892 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.942358 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.942768 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.943921 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.948253 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs"] Mar 11 09:39:17 crc kubenswrapper[4830]: I0311 09:39:17.953934 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.074312 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.074401 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whd7g\" (UniqueName: \"kubernetes.io/projected/d57e6a98-80e8-40a0-af5d-56d936e6ab67-kube-api-access-whd7g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: 
I0311 09:39:18.074640 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.074800 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.176829 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.176945 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.177008 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.177099 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whd7g\" (UniqueName: \"kubernetes.io/projected/d57e6a98-80e8-40a0-af5d-56d936e6ab67-kube-api-access-whd7g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.182458 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.183242 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.187518 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.196296 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whd7g\" (UniqueName: \"kubernetes.io/projected/d57e6a98-80e8-40a0-af5d-56d936e6ab67-kube-api-access-whd7g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.259917 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.807645 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs"] Mar 11 09:39:18 crc kubenswrapper[4830]: I0311 09:39:18.842002 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" event={"ID":"d57e6a98-80e8-40a0-af5d-56d936e6ab67","Type":"ContainerStarted","Data":"1ebc6c83e7c735b31868da9dbf2665020a974688e1d91e62a527011ce9059fdc"} Mar 11 09:39:27 crc kubenswrapper[4830]: I0311 09:39:27.930860 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" event={"ID":"d57e6a98-80e8-40a0-af5d-56d936e6ab67","Type":"ContainerStarted","Data":"0445f0e418ec8f6cf0c70a2d8bcc0a41339984dabadaeabbe66c8acce87af910"} Mar 11 09:39:27 crc kubenswrapper[4830]: I0311 09:39:27.951917 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" podStartSLOduration=2.800941028 podStartE2EDuration="10.951888808s" podCreationTimestamp="2026-03-11 09:39:17 +0000 UTC" firstStartedPulling="2026-03-11 09:39:18.818990386 +0000 UTC m=+1526.600141075" 
lastFinishedPulling="2026-03-11 09:39:26.969938166 +0000 UTC m=+1534.751088855" observedRunningTime="2026-03-11 09:39:27.946145841 +0000 UTC m=+1535.727296540" watchObservedRunningTime="2026-03-11 09:39:27.951888808 +0000 UTC m=+1535.733039517" Mar 11 09:39:28 crc kubenswrapper[4830]: I0311 09:39:28.841201 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 11 09:39:29 crc kubenswrapper[4830]: I0311 09:39:29.926684 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kpclw"] Mar 11 09:39:29 crc kubenswrapper[4830]: I0311 09:39:29.934240 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:29 crc kubenswrapper[4830]: I0311 09:39:29.945239 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 11 09:39:29 crc kubenswrapper[4830]: I0311 09:39:29.965510 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpclw"] Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.016237 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-catalog-content\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.016313 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-utilities\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.016385 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7krzj\" (UniqueName: \"kubernetes.io/projected/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-kube-api-access-7krzj\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.118927 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-catalog-content\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.118993 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-utilities\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.119080 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7krzj\" (UniqueName: \"kubernetes.io/projected/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-kube-api-access-7krzj\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.119516 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-catalog-content\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.119645 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-utilities\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.137293 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7krzj\" (UniqueName: \"kubernetes.io/projected/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-kube-api-access-7krzj\") pod \"redhat-operators-kpclw\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.268220 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:30 crc kubenswrapper[4830]: W0311 09:39:30.793862 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod679a3e99_cdd1_4ae4_b4f5_e7b7ebe55efc.slice/crio-118c4fb89902ef29658f71706313999170729705c84295d950d5909a886737fd WatchSource:0}: Error finding container 118c4fb89902ef29658f71706313999170729705c84295d950d5909a886737fd: Status 404 returned error can't find the container with id 118c4fb89902ef29658f71706313999170729705c84295d950d5909a886737fd Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.794060 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpclw"] Mar 11 09:39:30 crc kubenswrapper[4830]: I0311 09:39:30.975415 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpclw" event={"ID":"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc","Type":"ContainerStarted","Data":"118c4fb89902ef29658f71706313999170729705c84295d950d5909a886737fd"} Mar 11 09:39:31 crc kubenswrapper[4830]: I0311 09:39:31.985925 4830 
generic.go:334] "Generic (PLEG): container finished" podID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerID="8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561" exitCode=0 Mar 11 09:39:31 crc kubenswrapper[4830]: I0311 09:39:31.985999 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpclw" event={"ID":"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc","Type":"ContainerDied","Data":"8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561"} Mar 11 09:39:34 crc kubenswrapper[4830]: I0311 09:39:34.005881 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpclw" event={"ID":"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc","Type":"ContainerStarted","Data":"2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554"} Mar 11 09:39:40 crc kubenswrapper[4830]: I0311 09:39:40.064386 4830 generic.go:334] "Generic (PLEG): container finished" podID="d57e6a98-80e8-40a0-af5d-56d936e6ab67" containerID="0445f0e418ec8f6cf0c70a2d8bcc0a41339984dabadaeabbe66c8acce87af910" exitCode=0 Mar 11 09:39:40 crc kubenswrapper[4830]: I0311 09:39:40.064483 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" event={"ID":"d57e6a98-80e8-40a0-af5d-56d936e6ab67","Type":"ContainerDied","Data":"0445f0e418ec8f6cf0c70a2d8bcc0a41339984dabadaeabbe66c8acce87af910"} Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.626304 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.749951 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-repo-setup-combined-ca-bundle\") pod \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.750443 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-inventory\") pod \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.750560 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whd7g\" (UniqueName: \"kubernetes.io/projected/d57e6a98-80e8-40a0-af5d-56d936e6ab67-kube-api-access-whd7g\") pod \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.750613 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-ssh-key-openstack-edpm-ipam\") pod \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\" (UID: \"d57e6a98-80e8-40a0-af5d-56d936e6ab67\") " Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.762442 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d57e6a98-80e8-40a0-af5d-56d936e6ab67" (UID: "d57e6a98-80e8-40a0-af5d-56d936e6ab67"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.768336 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57e6a98-80e8-40a0-af5d-56d936e6ab67-kube-api-access-whd7g" (OuterVolumeSpecName: "kube-api-access-whd7g") pod "d57e6a98-80e8-40a0-af5d-56d936e6ab67" (UID: "d57e6a98-80e8-40a0-af5d-56d936e6ab67"). InnerVolumeSpecName "kube-api-access-whd7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.785312 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d57e6a98-80e8-40a0-af5d-56d936e6ab67" (UID: "d57e6a98-80e8-40a0-af5d-56d936e6ab67"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.785796 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-inventory" (OuterVolumeSpecName: "inventory") pod "d57e6a98-80e8-40a0-af5d-56d936e6ab67" (UID: "d57e6a98-80e8-40a0-af5d-56d936e6ab67"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.853012 4830 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.853128 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.853144 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whd7g\" (UniqueName: \"kubernetes.io/projected/d57e6a98-80e8-40a0-af5d-56d936e6ab67-kube-api-access-whd7g\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:41 crc kubenswrapper[4830]: I0311 09:39:41.853156 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d57e6a98-80e8-40a0-af5d-56d936e6ab67-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.084544 4830 generic.go:334] "Generic (PLEG): container finished" podID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerID="2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554" exitCode=0 Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.084629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpclw" event={"ID":"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc","Type":"ContainerDied","Data":"2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554"} Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.088140 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" 
event={"ID":"d57e6a98-80e8-40a0-af5d-56d936e6ab67","Type":"ContainerDied","Data":"1ebc6c83e7c735b31868da9dbf2665020a974688e1d91e62a527011ce9059fdc"} Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.088167 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ebc6c83e7c735b31868da9dbf2665020a974688e1d91e62a527011ce9059fdc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.088171 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.192222 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc"] Mar 11 09:39:42 crc kubenswrapper[4830]: E0311 09:39:42.192708 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57e6a98-80e8-40a0-af5d-56d936e6ab67" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.192733 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57e6a98-80e8-40a0-af5d-56d936e6ab67" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.192998 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57e6a98-80e8-40a0-af5d-56d936e6ab67" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.193810 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.197537 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.197939 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.198233 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.198450 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.216526 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc"] Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.272447 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.272530 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmcd7\" (UniqueName: \"kubernetes.io/projected/2b0b1934-6dd3-441c-923d-67b9ed28a177-kube-api-access-wmcd7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.272590 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.374731 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.374817 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmcd7\" (UniqueName: \"kubernetes.io/projected/2b0b1934-6dd3-441c-923d-67b9ed28a177-kube-api-access-wmcd7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.374864 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.379916 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.381577 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.398229 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmcd7\" (UniqueName: \"kubernetes.io/projected/2b0b1934-6dd3-441c-923d-67b9ed28a177-kube-api-access-wmcd7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mrqxc\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:42 crc kubenswrapper[4830]: I0311 09:39:42.510662 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:43 crc kubenswrapper[4830]: I0311 09:39:43.060980 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:39:43 crc kubenswrapper[4830]: I0311 09:39:43.061367 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:39:43 crc kubenswrapper[4830]: I0311 09:39:43.071691 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc"] Mar 11 09:39:43 crc kubenswrapper[4830]: W0311 09:39:43.078489 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b0b1934_6dd3_441c_923d_67b9ed28a177.slice/crio-aba412131aef6a032964be15918dd2a67d068cb2792f2cc6e8c49e7ebea0aa30 WatchSource:0}: Error finding container aba412131aef6a032964be15918dd2a67d068cb2792f2cc6e8c49e7ebea0aa30: Status 404 returned error can't find the container with id aba412131aef6a032964be15918dd2a67d068cb2792f2cc6e8c49e7ebea0aa30 Mar 11 09:39:43 crc kubenswrapper[4830]: I0311 09:39:43.102737 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpclw" event={"ID":"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc","Type":"ContainerStarted","Data":"727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584"} Mar 11 09:39:43 crc kubenswrapper[4830]: I0311 09:39:43.104730 4830 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" event={"ID":"2b0b1934-6dd3-441c-923d-67b9ed28a177","Type":"ContainerStarted","Data":"aba412131aef6a032964be15918dd2a67d068cb2792f2cc6e8c49e7ebea0aa30"} Mar 11 09:39:43 crc kubenswrapper[4830]: I0311 09:39:43.128779 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kpclw" podStartSLOduration=3.526373051 podStartE2EDuration="14.128755517s" podCreationTimestamp="2026-03-11 09:39:29 +0000 UTC" firstStartedPulling="2026-03-11 09:39:31.988602577 +0000 UTC m=+1539.769753276" lastFinishedPulling="2026-03-11 09:39:42.590985053 +0000 UTC m=+1550.372135742" observedRunningTime="2026-03-11 09:39:43.121578599 +0000 UTC m=+1550.902729298" watchObservedRunningTime="2026-03-11 09:39:43.128755517 +0000 UTC m=+1550.909906206" Mar 11 09:39:44 crc kubenswrapper[4830]: I0311 09:39:44.120569 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" event={"ID":"2b0b1934-6dd3-441c-923d-67b9ed28a177","Type":"ContainerStarted","Data":"938dbd01407ae8ede26642c12fc37a701e4bd6b3bcc65650abfeadaa6a6833aa"} Mar 11 09:39:44 crc kubenswrapper[4830]: I0311 09:39:44.163921 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" podStartSLOduration=1.705791421 podStartE2EDuration="2.163895784s" podCreationTimestamp="2026-03-11 09:39:42 +0000 UTC" firstStartedPulling="2026-03-11 09:39:43.081140267 +0000 UTC m=+1550.862290956" lastFinishedPulling="2026-03-11 09:39:43.53924463 +0000 UTC m=+1551.320395319" observedRunningTime="2026-03-11 09:39:44.138757592 +0000 UTC m=+1551.919908331" watchObservedRunningTime="2026-03-11 09:39:44.163895784 +0000 UTC m=+1551.945046473" Mar 11 09:39:46 crc kubenswrapper[4830]: I0311 09:39:46.136769 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="2b0b1934-6dd3-441c-923d-67b9ed28a177" containerID="938dbd01407ae8ede26642c12fc37a701e4bd6b3bcc65650abfeadaa6a6833aa" exitCode=0 Mar 11 09:39:46 crc kubenswrapper[4830]: I0311 09:39:46.136817 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" event={"ID":"2b0b1934-6dd3-441c-923d-67b9ed28a177","Type":"ContainerDied","Data":"938dbd01407ae8ede26642c12fc37a701e4bd6b3bcc65650abfeadaa6a6833aa"} Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.529267 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.673623 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-inventory\") pod \"2b0b1934-6dd3-441c-923d-67b9ed28a177\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.673734 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-ssh-key-openstack-edpm-ipam\") pod \"2b0b1934-6dd3-441c-923d-67b9ed28a177\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.673834 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmcd7\" (UniqueName: \"kubernetes.io/projected/2b0b1934-6dd3-441c-923d-67b9ed28a177-kube-api-access-wmcd7\") pod \"2b0b1934-6dd3-441c-923d-67b9ed28a177\" (UID: \"2b0b1934-6dd3-441c-923d-67b9ed28a177\") " Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.680322 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b0b1934-6dd3-441c-923d-67b9ed28a177-kube-api-access-wmcd7" 
(OuterVolumeSpecName: "kube-api-access-wmcd7") pod "2b0b1934-6dd3-441c-923d-67b9ed28a177" (UID: "2b0b1934-6dd3-441c-923d-67b9ed28a177"). InnerVolumeSpecName "kube-api-access-wmcd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.704104 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-inventory" (OuterVolumeSpecName: "inventory") pod "2b0b1934-6dd3-441c-923d-67b9ed28a177" (UID: "2b0b1934-6dd3-441c-923d-67b9ed28a177"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.704150 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2b0b1934-6dd3-441c-923d-67b9ed28a177" (UID: "2b0b1934-6dd3-441c-923d-67b9ed28a177"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.777209 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.777246 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2b0b1934-6dd3-441c-923d-67b9ed28a177-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:47 crc kubenswrapper[4830]: I0311 09:39:47.777309 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmcd7\" (UniqueName: \"kubernetes.io/projected/2b0b1934-6dd3-441c-923d-67b9ed28a177-kube-api-access-wmcd7\") on node \"crc\" DevicePath \"\"" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.156456 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" event={"ID":"2b0b1934-6dd3-441c-923d-67b9ed28a177","Type":"ContainerDied","Data":"aba412131aef6a032964be15918dd2a67d068cb2792f2cc6e8c49e7ebea0aa30"} Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.156494 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aba412131aef6a032964be15918dd2a67d068cb2792f2cc6e8c49e7ebea0aa30" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.156559 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mrqxc" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.229209 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d"] Mar 11 09:39:48 crc kubenswrapper[4830]: E0311 09:39:48.229597 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b0b1934-6dd3-441c-923d-67b9ed28a177" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.229614 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b0b1934-6dd3-441c-923d-67b9ed28a177" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.229807 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b0b1934-6dd3-441c-923d-67b9ed28a177" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.230504 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.232131 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.232339 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.232595 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.232990 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.237483 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d"] Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.286815 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.286936 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6snb9\" (UniqueName: \"kubernetes.io/projected/751133b2-5530-48d1-9cb0-4e69aadf979a-kube-api-access-6snb9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.286983 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.287048 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.388189 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6snb9\" (UniqueName: \"kubernetes.io/projected/751133b2-5530-48d1-9cb0-4e69aadf979a-kube-api-access-6snb9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.388273 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.388310 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.388408 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.393128 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.400785 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.400957 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.408367 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6snb9\" (UniqueName: \"kubernetes.io/projected/751133b2-5530-48d1-9cb0-4e69aadf979a-kube-api-access-6snb9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:48 crc kubenswrapper[4830]: I0311 09:39:48.546740 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:39:49 crc kubenswrapper[4830]: I0311 09:39:49.085562 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d"] Mar 11 09:39:49 crc kubenswrapper[4830]: I0311 09:39:49.168457 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" event={"ID":"751133b2-5530-48d1-9cb0-4e69aadf979a","Type":"ContainerStarted","Data":"95eaf439e64d209d93af6ef296c470ae8b289e49238308d6da4d3261eaf6107f"} Mar 11 09:39:50 crc kubenswrapper[4830]: I0311 09:39:50.179011 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" event={"ID":"751133b2-5530-48d1-9cb0-4e69aadf979a","Type":"ContainerStarted","Data":"88054b8385a79bcda7195e05905b1f3fa5fcdaa57f23efbcc879d9b9f2c3e83f"} Mar 11 09:39:50 crc kubenswrapper[4830]: I0311 09:39:50.214845 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" podStartSLOduration=1.823798735 podStartE2EDuration="2.214821961s" podCreationTimestamp="2026-03-11 09:39:48 +0000 UTC" firstStartedPulling="2026-03-11 09:39:49.096742004 +0000 UTC m=+1556.877892693" 
lastFinishedPulling="2026-03-11 09:39:49.48776523 +0000 UTC m=+1557.268915919" observedRunningTime="2026-03-11 09:39:50.195636334 +0000 UTC m=+1557.976787043" watchObservedRunningTime="2026-03-11 09:39:50.214821961 +0000 UTC m=+1557.995972650" Mar 11 09:39:50 crc kubenswrapper[4830]: I0311 09:39:50.268597 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:50 crc kubenswrapper[4830]: I0311 09:39:50.268680 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:39:51 crc kubenswrapper[4830]: I0311 09:39:51.320438 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kpclw" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="registry-server" probeResult="failure" output=< Mar 11 09:39:51 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 09:39:51 crc kubenswrapper[4830]: > Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.147537 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553700-smlzg"] Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.149907 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553700-smlzg" Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.152387 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.152606 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.153265 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.156239 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553700-smlzg"] Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.227329 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5wqx\" (UniqueName: \"kubernetes.io/projected/bdff209b-dbf7-4005-b5c6-66091cdfebc0-kube-api-access-d5wqx\") pod \"auto-csr-approver-29553700-smlzg\" (UID: \"bdff209b-dbf7-4005-b5c6-66091cdfebc0\") " pod="openshift-infra/auto-csr-approver-29553700-smlzg" Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.330003 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5wqx\" (UniqueName: \"kubernetes.io/projected/bdff209b-dbf7-4005-b5c6-66091cdfebc0-kube-api-access-d5wqx\") pod \"auto-csr-approver-29553700-smlzg\" (UID: \"bdff209b-dbf7-4005-b5c6-66091cdfebc0\") " pod="openshift-infra/auto-csr-approver-29553700-smlzg" Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.349005 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5wqx\" (UniqueName: \"kubernetes.io/projected/bdff209b-dbf7-4005-b5c6-66091cdfebc0-kube-api-access-d5wqx\") pod \"auto-csr-approver-29553700-smlzg\" (UID: \"bdff209b-dbf7-4005-b5c6-66091cdfebc0\") " 
pod="openshift-infra/auto-csr-approver-29553700-smlzg" Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.475320 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553700-smlzg" Mar 11 09:40:00 crc kubenswrapper[4830]: I0311 09:40:00.945662 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553700-smlzg"] Mar 11 09:40:00 crc kubenswrapper[4830]: W0311 09:40:00.946776 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdff209b_dbf7_4005_b5c6_66091cdfebc0.slice/crio-63f31478ae96c4a1f1792eac6ac98f6897c8e873d909c7f94d892d3c7181d7f8 WatchSource:0}: Error finding container 63f31478ae96c4a1f1792eac6ac98f6897c8e873d909c7f94d892d3c7181d7f8: Status 404 returned error can't find the container with id 63f31478ae96c4a1f1792eac6ac98f6897c8e873d909c7f94d892d3c7181d7f8 Mar 11 09:40:01 crc kubenswrapper[4830]: I0311 09:40:01.292548 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553700-smlzg" event={"ID":"bdff209b-dbf7-4005-b5c6-66091cdfebc0","Type":"ContainerStarted","Data":"63f31478ae96c4a1f1792eac6ac98f6897c8e873d909c7f94d892d3c7181d7f8"} Mar 11 09:40:01 crc kubenswrapper[4830]: I0311 09:40:01.314662 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kpclw" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="registry-server" probeResult="failure" output=< Mar 11 09:40:01 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 09:40:01 crc kubenswrapper[4830]: > Mar 11 09:40:01 crc kubenswrapper[4830]: I0311 09:40:01.962999 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sdfbn"] Mar 11 09:40:01 crc kubenswrapper[4830]: I0311 09:40:01.969536 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:01 crc kubenswrapper[4830]: I0311 09:40:01.975448 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdfbn"] Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.063205 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f470d035-fb23-4ecc-b36e-b61886bfab43-utilities\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.063649 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f470d035-fb23-4ecc-b36e-b61886bfab43-catalog-content\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.063856 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9h4k\" (UniqueName: \"kubernetes.io/projected/f470d035-fb23-4ecc-b36e-b61886bfab43-kube-api-access-j9h4k\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.165506 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9h4k\" (UniqueName: \"kubernetes.io/projected/f470d035-fb23-4ecc-b36e-b61886bfab43-kube-api-access-j9h4k\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.165686 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f470d035-fb23-4ecc-b36e-b61886bfab43-utilities\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.165717 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f470d035-fb23-4ecc-b36e-b61886bfab43-catalog-content\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.166429 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f470d035-fb23-4ecc-b36e-b61886bfab43-catalog-content\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.166530 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f470d035-fb23-4ecc-b36e-b61886bfab43-utilities\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.190083 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9h4k\" (UniqueName: \"kubernetes.io/projected/f470d035-fb23-4ecc-b36e-b61886bfab43-kube-api-access-j9h4k\") pod \"certified-operators-sdfbn\" (UID: \"f470d035-fb23-4ecc-b36e-b61886bfab43\") " pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.290370 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:02 crc kubenswrapper[4830]: I0311 09:40:02.786643 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdfbn"] Mar 11 09:40:02 crc kubenswrapper[4830]: W0311 09:40:02.803620 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf470d035_fb23_4ecc_b36e_b61886bfab43.slice/crio-27f8c8098d2dedfd0d34480852e7c51ebd4b9e953fca856189d3d3b4677e46e3 WatchSource:0}: Error finding container 27f8c8098d2dedfd0d34480852e7c51ebd4b9e953fca856189d3d3b4677e46e3: Status 404 returned error can't find the container with id 27f8c8098d2dedfd0d34480852e7c51ebd4b9e953fca856189d3d3b4677e46e3 Mar 11 09:40:03 crc kubenswrapper[4830]: I0311 09:40:03.314889 4830 generic.go:334] "Generic (PLEG): container finished" podID="bdff209b-dbf7-4005-b5c6-66091cdfebc0" containerID="f5377fb376919a03714090ebd377aa81fa3c44a1c34ef7872ff669a17ba1b3f6" exitCode=0 Mar 11 09:40:03 crc kubenswrapper[4830]: I0311 09:40:03.314976 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553700-smlzg" event={"ID":"bdff209b-dbf7-4005-b5c6-66091cdfebc0","Type":"ContainerDied","Data":"f5377fb376919a03714090ebd377aa81fa3c44a1c34ef7872ff669a17ba1b3f6"} Mar 11 09:40:03 crc kubenswrapper[4830]: I0311 09:40:03.316483 4830 generic.go:334] "Generic (PLEG): container finished" podID="f470d035-fb23-4ecc-b36e-b61886bfab43" containerID="c979992b77a6cf74e121db5559d264a60ada09c79a8a5c5b727b3d87cf641b16" exitCode=0 Mar 11 09:40:03 crc kubenswrapper[4830]: I0311 09:40:03.316510 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdfbn" event={"ID":"f470d035-fb23-4ecc-b36e-b61886bfab43","Type":"ContainerDied","Data":"c979992b77a6cf74e121db5559d264a60ada09c79a8a5c5b727b3d87cf641b16"} Mar 11 09:40:03 crc kubenswrapper[4830]: I0311 
09:40:03.316525 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdfbn" event={"ID":"f470d035-fb23-4ecc-b36e-b61886bfab43","Type":"ContainerStarted","Data":"27f8c8098d2dedfd0d34480852e7c51ebd4b9e953fca856189d3d3b4677e46e3"} Mar 11 09:40:04 crc kubenswrapper[4830]: I0311 09:40:04.720146 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553700-smlzg" Mar 11 09:40:04 crc kubenswrapper[4830]: I0311 09:40:04.843709 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5wqx\" (UniqueName: \"kubernetes.io/projected/bdff209b-dbf7-4005-b5c6-66091cdfebc0-kube-api-access-d5wqx\") pod \"bdff209b-dbf7-4005-b5c6-66091cdfebc0\" (UID: \"bdff209b-dbf7-4005-b5c6-66091cdfebc0\") " Mar 11 09:40:04 crc kubenswrapper[4830]: I0311 09:40:04.851556 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdff209b-dbf7-4005-b5c6-66091cdfebc0-kube-api-access-d5wqx" (OuterVolumeSpecName: "kube-api-access-d5wqx") pod "bdff209b-dbf7-4005-b5c6-66091cdfebc0" (UID: "bdff209b-dbf7-4005-b5c6-66091cdfebc0"). InnerVolumeSpecName "kube-api-access-d5wqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:40:04 crc kubenswrapper[4830]: I0311 09:40:04.946108 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5wqx\" (UniqueName: \"kubernetes.io/projected/bdff209b-dbf7-4005-b5c6-66091cdfebc0-kube-api-access-d5wqx\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.157419 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ld6pc"] Mar 11 09:40:05 crc kubenswrapper[4830]: E0311 09:40:05.158438 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdff209b-dbf7-4005-b5c6-66091cdfebc0" containerName="oc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.158459 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdff209b-dbf7-4005-b5c6-66091cdfebc0" containerName="oc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.158712 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdff209b-dbf7-4005-b5c6-66091cdfebc0" containerName="oc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.160246 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.175125 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ld6pc"] Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.250220 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-utilities\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.250552 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m7ff\" (UniqueName: \"kubernetes.io/projected/bb1f2a22-9450-4112-9a20-d8b650cb410b-kube-api-access-4m7ff\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.250613 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-catalog-content\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.336300 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553700-smlzg" event={"ID":"bdff209b-dbf7-4005-b5c6-66091cdfebc0","Type":"ContainerDied","Data":"63f31478ae96c4a1f1792eac6ac98f6897c8e873d909c7f94d892d3c7181d7f8"} Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.336344 4830 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="63f31478ae96c4a1f1792eac6ac98f6897c8e873d909c7f94d892d3c7181d7f8" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.336398 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553700-smlzg" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.351955 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m7ff\" (UniqueName: \"kubernetes.io/projected/bb1f2a22-9450-4112-9a20-d8b650cb410b-kube-api-access-4m7ff\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.352016 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-catalog-content\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.352190 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-utilities\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.352721 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-utilities\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.353463 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-catalog-content\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.393900 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m7ff\" (UniqueName: \"kubernetes.io/projected/bb1f2a22-9450-4112-9a20-d8b650cb410b-kube-api-access-4m7ff\") pod \"redhat-marketplace-ld6pc\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.493609 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.805371 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553694-f9xwg"] Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.827592 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553694-f9xwg"] Mar 11 09:40:05 crc kubenswrapper[4830]: I0311 09:40:05.985745 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ld6pc"] Mar 11 09:40:05 crc kubenswrapper[4830]: W0311 09:40:05.994555 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb1f2a22_9450_4112_9a20_d8b650cb410b.slice/crio-bf10e1e68059884c234ffc66a9938ff647d522313050fb665b62f4468fc2c2a3 WatchSource:0}: Error finding container bf10e1e68059884c234ffc66a9938ff647d522313050fb665b62f4468fc2c2a3: Status 404 returned error can't find the container with id bf10e1e68059884c234ffc66a9938ff647d522313050fb665b62f4468fc2c2a3 Mar 11 09:40:06 crc kubenswrapper[4830]: I0311 09:40:06.345779 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ld6pc" event={"ID":"bb1f2a22-9450-4112-9a20-d8b650cb410b","Type":"ContainerStarted","Data":"bf10e1e68059884c234ffc66a9938ff647d522313050fb665b62f4468fc2c2a3"} Mar 11 09:40:06 crc kubenswrapper[4830]: I0311 09:40:06.953325 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74000d07-4644-41b1-90d3-0de67ed840c7" path="/var/lib/kubelet/pods/74000d07-4644-41b1-90d3-0de67ed840c7/volumes" Mar 11 09:40:07 crc kubenswrapper[4830]: I0311 09:40:07.359722 4830 generic.go:334] "Generic (PLEG): container finished" podID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerID="3cac5c22189c5b9f02535366a27c20ffb1cd5c518b9a96545724314f665c87d7" exitCode=0 Mar 11 09:40:07 crc kubenswrapper[4830]: I0311 09:40:07.359832 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ld6pc" event={"ID":"bb1f2a22-9450-4112-9a20-d8b650cb410b","Type":"ContainerDied","Data":"3cac5c22189c5b9f02535366a27c20ffb1cd5c518b9a96545724314f665c87d7"} Mar 11 09:40:07 crc kubenswrapper[4830]: I0311 09:40:07.806584 4830 scope.go:117] "RemoveContainer" containerID="77716cbaf234aa7148b92e8585531c510c9e63187c20a1e92cb8512464b65857" Mar 11 09:40:09 crc kubenswrapper[4830]: I0311 09:40:09.142829 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 09:40:10 crc kubenswrapper[4830]: I0311 09:40:10.407218 4830 generic.go:334] "Generic (PLEG): container finished" podID="f470d035-fb23-4ecc-b36e-b61886bfab43" containerID="655df6951dc861a0af898da1f79e91b933bbdd90ae8f2c02660b78e91ffb8476" exitCode=0 Mar 11 09:40:10 crc kubenswrapper[4830]: I0311 09:40:10.407291 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdfbn" event={"ID":"f470d035-fb23-4ecc-b36e-b61886bfab43","Type":"ContainerDied","Data":"655df6951dc861a0af898da1f79e91b933bbdd90ae8f2c02660b78e91ffb8476"} Mar 11 09:40:11 crc kubenswrapper[4830]: I0311 
09:40:11.341672 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kpclw" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="registry-server" probeResult="failure" output=< Mar 11 09:40:11 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 09:40:11 crc kubenswrapper[4830]: > Mar 11 09:40:11 crc kubenswrapper[4830]: I0311 09:40:11.426487 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ld6pc" event={"ID":"bb1f2a22-9450-4112-9a20-d8b650cb410b","Type":"ContainerStarted","Data":"24121f8b41de53a0a961e2d02e5ea4a1dded98e808056dfd940b30aa6b6beadd"} Mar 11 09:40:12 crc kubenswrapper[4830]: I0311 09:40:12.440243 4830 generic.go:334] "Generic (PLEG): container finished" podID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerID="24121f8b41de53a0a961e2d02e5ea4a1dded98e808056dfd940b30aa6b6beadd" exitCode=0 Mar 11 09:40:12 crc kubenswrapper[4830]: I0311 09:40:12.440298 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ld6pc" event={"ID":"bb1f2a22-9450-4112-9a20-d8b650cb410b","Type":"ContainerDied","Data":"24121f8b41de53a0a961e2d02e5ea4a1dded98e808056dfd940b30aa6b6beadd"} Mar 11 09:40:12 crc kubenswrapper[4830]: I0311 09:40:12.446214 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdfbn" event={"ID":"f470d035-fb23-4ecc-b36e-b61886bfab43","Type":"ContainerStarted","Data":"230bbc142907256631b94ed24a54b186da73e394ec6781e7517e7b0bfe8e527f"} Mar 11 09:40:12 crc kubenswrapper[4830]: I0311 09:40:12.487148 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sdfbn" podStartSLOduration=3.723009684 podStartE2EDuration="11.487128071s" podCreationTimestamp="2026-03-11 09:40:01 +0000 UTC" firstStartedPulling="2026-03-11 09:40:03.318355932 +0000 UTC m=+1571.099506611" 
lastFinishedPulling="2026-03-11 09:40:11.082474269 +0000 UTC m=+1578.863624998" observedRunningTime="2026-03-11 09:40:12.481974669 +0000 UTC m=+1580.263125368" watchObservedRunningTime="2026-03-11 09:40:12.487128071 +0000 UTC m=+1580.268278760" Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.059944 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.060000 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.060057 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.060805 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.060861 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" 
containerID="cri-o://bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" gracePeriod=600 Mar 11 09:40:13 crc kubenswrapper[4830]: E0311 09:40:13.178693 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.458230 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" exitCode=0 Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.458300 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"} Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.458355 4830 scope.go:117] "RemoveContainer" containerID="6d51cf8acf1c408e7829c31a89fc6bf74196f438e8b371de9aaedaab30e9cfc5" Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.458894 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:40:13 crc kubenswrapper[4830]: E0311 09:40:13.459162 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" 
podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.467134 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ld6pc" event={"ID":"bb1f2a22-9450-4112-9a20-d8b650cb410b","Type":"ContainerStarted","Data":"1dd692080f8bc78c98ab847e01a17d943ce738b215d3a5cb9e164c860b5aadf9"} Mar 11 09:40:13 crc kubenswrapper[4830]: I0311 09:40:13.539544 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ld6pc" podStartSLOduration=4.844957075 podStartE2EDuration="8.539523622s" podCreationTimestamp="2026-03-11 09:40:05 +0000 UTC" firstStartedPulling="2026-03-11 09:40:09.142587164 +0000 UTC m=+1576.923737853" lastFinishedPulling="2026-03-11 09:40:12.837153711 +0000 UTC m=+1580.618304400" observedRunningTime="2026-03-11 09:40:13.537673731 +0000 UTC m=+1581.318824460" watchObservedRunningTime="2026-03-11 09:40:13.539523622 +0000 UTC m=+1581.320674311" Mar 11 09:40:15 crc kubenswrapper[4830]: I0311 09:40:15.494131 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:15 crc kubenswrapper[4830]: I0311 09:40:15.496583 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:15 crc kubenswrapper[4830]: I0311 09:40:15.547313 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.145973 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8jn8f"] Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.148616 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.166905 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jn8f"] Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.241970 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-utilities\") pod \"community-operators-8jn8f\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.242366 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g2kj\" (UniqueName: \"kubernetes.io/projected/fa5b65bd-2388-4817-a862-146906fce050-kube-api-access-9g2kj\") pod \"community-operators-8jn8f\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.242586 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-catalog-content\") pod \"community-operators-8jn8f\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.316389 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.344132 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g2kj\" (UniqueName: \"kubernetes.io/projected/fa5b65bd-2388-4817-a862-146906fce050-kube-api-access-9g2kj\") pod \"community-operators-8jn8f\" (UID: 
\"fa5b65bd-2388-4817-a862-146906fce050\") " pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.344280 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-catalog-content\") pod \"community-operators-8jn8f\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.344350 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-utilities\") pod \"community-operators-8jn8f\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.344816 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-catalog-content\") pod \"community-operators-8jn8f\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.345184 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-utilities\") pod \"community-operators-8jn8f\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.364273 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g2kj\" (UniqueName: \"kubernetes.io/projected/fa5b65bd-2388-4817-a862-146906fce050-kube-api-access-9g2kj\") pod \"community-operators-8jn8f\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " 
pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.371589 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:40:20 crc kubenswrapper[4830]: I0311 09:40:20.471743 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:21 crc kubenswrapper[4830]: I0311 09:40:21.069900 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8jn8f"] Mar 11 09:40:21 crc kubenswrapper[4830]: I0311 09:40:21.548590 4830 generic.go:334] "Generic (PLEG): container finished" podID="fa5b65bd-2388-4817-a862-146906fce050" containerID="9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0" exitCode=0 Mar 11 09:40:21 crc kubenswrapper[4830]: I0311 09:40:21.548705 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jn8f" event={"ID":"fa5b65bd-2388-4817-a862-146906fce050","Type":"ContainerDied","Data":"9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0"} Mar 11 09:40:21 crc kubenswrapper[4830]: I0311 09:40:21.548917 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jn8f" event={"ID":"fa5b65bd-2388-4817-a862-146906fce050","Type":"ContainerStarted","Data":"886364c872485a56f7cf05f3e80795fefed1c5ff74530a9b71b9cd7342c52885"} Mar 11 09:40:22 crc kubenswrapper[4830]: I0311 09:40:22.291266 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:22 crc kubenswrapper[4830]: I0311 09:40:22.291333 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:22 crc kubenswrapper[4830]: I0311 09:40:22.356771 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:22 crc kubenswrapper[4830]: I0311 09:40:22.564159 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jn8f" event={"ID":"fa5b65bd-2388-4817-a862-146906fce050","Type":"ContainerStarted","Data":"6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910"} Mar 11 09:40:22 crc kubenswrapper[4830]: I0311 09:40:22.622099 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sdfbn" Mar 11 09:40:22 crc kubenswrapper[4830]: I0311 09:40:22.728592 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpclw"] Mar 11 09:40:22 crc kubenswrapper[4830]: I0311 09:40:22.728870 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kpclw" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="registry-server" containerID="cri-o://727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584" gracePeriod=2 Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.195220 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.324551 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-utilities\") pod \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.324822 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-catalog-content\") pod \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.324934 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7krzj\" (UniqueName: \"kubernetes.io/projected/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-kube-api-access-7krzj\") pod \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\" (UID: \"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc\") " Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.326665 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-utilities" (OuterVolumeSpecName: "utilities") pod "679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" (UID: "679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.330840 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-kube-api-access-7krzj" (OuterVolumeSpecName: "kube-api-access-7krzj") pod "679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" (UID: "679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc"). InnerVolumeSpecName "kube-api-access-7krzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.427404 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7krzj\" (UniqueName: \"kubernetes.io/projected/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-kube-api-access-7krzj\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.427431 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.437824 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" (UID: "679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.528860 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.575260 4830 generic.go:334] "Generic (PLEG): container finished" podID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerID="727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584" exitCode=0 Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.575345 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpclw" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.575355 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpclw" event={"ID":"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc","Type":"ContainerDied","Data":"727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584"} Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.575406 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpclw" event={"ID":"679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc","Type":"ContainerDied","Data":"118c4fb89902ef29658f71706313999170729705c84295d950d5909a886737fd"} Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.575429 4830 scope.go:117] "RemoveContainer" containerID="727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.581676 4830 generic.go:334] "Generic (PLEG): container finished" podID="fa5b65bd-2388-4817-a862-146906fce050" containerID="6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910" exitCode=0 Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.581771 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jn8f" event={"ID":"fa5b65bd-2388-4817-a862-146906fce050","Type":"ContainerDied","Data":"6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910"} Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.626549 4830 scope.go:117] "RemoveContainer" containerID="2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.628644 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpclw"] Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.635748 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kpclw"] Mar 11 
09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.664967 4830 scope.go:117] "RemoveContainer" containerID="8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.706764 4830 scope.go:117] "RemoveContainer" containerID="727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584" Mar 11 09:40:23 crc kubenswrapper[4830]: E0311 09:40:23.707279 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584\": container with ID starting with 727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584 not found: ID does not exist" containerID="727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.707324 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584"} err="failed to get container status \"727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584\": rpc error: code = NotFound desc = could not find container \"727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584\": container with ID starting with 727be01d098b4c03881c1abdc7e4ed622670e65eaf9cd3b20ff00357b6136584 not found: ID does not exist" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.707352 4830 scope.go:117] "RemoveContainer" containerID="2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554" Mar 11 09:40:23 crc kubenswrapper[4830]: E0311 09:40:23.707705 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554\": container with ID starting with 2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554 not found: ID does not exist" 
containerID="2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.707747 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554"} err="failed to get container status \"2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554\": rpc error: code = NotFound desc = could not find container \"2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554\": container with ID starting with 2e24acfe4d7d58f2537fb253bef4c5900ff9dd96913f7070e9b1aa2280468554 not found: ID does not exist" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.707772 4830 scope.go:117] "RemoveContainer" containerID="8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561" Mar 11 09:40:23 crc kubenswrapper[4830]: E0311 09:40:23.708039 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561\": container with ID starting with 8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561 not found: ID does not exist" containerID="8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561" Mar 11 09:40:23 crc kubenswrapper[4830]: I0311 09:40:23.708066 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561"} err="failed to get container status \"8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561\": rpc error: code = NotFound desc = could not find container \"8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561\": container with ID starting with 8be4c5ef1ab52020977ff7121dadabca1feb92a84b711b2e61bb713dfce49561 not found: ID does not exist" Mar 11 09:40:24 crc kubenswrapper[4830]: I0311 09:40:24.594901 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jn8f" event={"ID":"fa5b65bd-2388-4817-a862-146906fce050","Type":"ContainerStarted","Data":"c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8"} Mar 11 09:40:24 crc kubenswrapper[4830]: I0311 09:40:24.614954 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8jn8f" podStartSLOduration=1.993872957 podStartE2EDuration="4.614936371s" podCreationTimestamp="2026-03-11 09:40:20 +0000 UTC" firstStartedPulling="2026-03-11 09:40:21.550293375 +0000 UTC m=+1589.331444064" lastFinishedPulling="2026-03-11 09:40:24.171356779 +0000 UTC m=+1591.952507478" observedRunningTime="2026-03-11 09:40:24.609265645 +0000 UTC m=+1592.390416344" watchObservedRunningTime="2026-03-11 09:40:24.614936371 +0000 UTC m=+1592.396087060" Mar 11 09:40:24 crc kubenswrapper[4830]: I0311 09:40:24.742929 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdfbn"] Mar 11 09:40:24 crc kubenswrapper[4830]: I0311 09:40:24.943941 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" path="/var/lib/kubelet/pods/679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc/volumes" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.127375 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ljms"] Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.127661 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5ljms" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerName="registry-server" containerID="cri-o://db72a4b45d8609aa4b31c10f192a8c101361b5bbb2227097178de533a6c8c11e" gracePeriod=2 Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.559902 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.631943 4830 generic.go:334] "Generic (PLEG): container finished" podID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerID="db72a4b45d8609aa4b31c10f192a8c101361b5bbb2227097178de533a6c8c11e" exitCode=0 Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.632951 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljms" event={"ID":"2635fd4f-dc7a-4524-bf2d-6307f49363c9","Type":"ContainerDied","Data":"db72a4b45d8609aa4b31c10f192a8c101361b5bbb2227097178de533a6c8c11e"} Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.632985 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ljms" event={"ID":"2635fd4f-dc7a-4524-bf2d-6307f49363c9","Type":"ContainerDied","Data":"e120c111fd30f51277128c7972bf7915dc62c04b82b1ee25b116c486f095463d"} Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.633000 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e120c111fd30f51277128c7972bf7915dc62c04b82b1ee25b116c486f095463d" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.681250 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.780135 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-catalog-content\") pod \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.780515 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-utilities\") pod \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.780641 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb5kq\" (UniqueName: \"kubernetes.io/projected/2635fd4f-dc7a-4524-bf2d-6307f49363c9-kube-api-access-gb5kq\") pod \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\" (UID: \"2635fd4f-dc7a-4524-bf2d-6307f49363c9\") " Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.794595 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-utilities" (OuterVolumeSpecName: "utilities") pod "2635fd4f-dc7a-4524-bf2d-6307f49363c9" (UID: "2635fd4f-dc7a-4524-bf2d-6307f49363c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.795037 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2635fd4f-dc7a-4524-bf2d-6307f49363c9-kube-api-access-gb5kq" (OuterVolumeSpecName: "kube-api-access-gb5kq") pod "2635fd4f-dc7a-4524-bf2d-6307f49363c9" (UID: "2635fd4f-dc7a-4524-bf2d-6307f49363c9"). InnerVolumeSpecName "kube-api-access-gb5kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.882328 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.882362 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb5kq\" (UniqueName: \"kubernetes.io/projected/2635fd4f-dc7a-4524-bf2d-6307f49363c9-kube-api-access-gb5kq\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.918523 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2635fd4f-dc7a-4524-bf2d-6307f49363c9" (UID: "2635fd4f-dc7a-4524-bf2d-6307f49363c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:40:25 crc kubenswrapper[4830]: I0311 09:40:25.983612 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2635fd4f-dc7a-4524-bf2d-6307f49363c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:26 crc kubenswrapper[4830]: I0311 09:40:26.639918 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ljms" Mar 11 09:40:26 crc kubenswrapper[4830]: I0311 09:40:26.675091 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ljms"] Mar 11 09:40:26 crc kubenswrapper[4830]: I0311 09:40:26.684336 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5ljms"] Mar 11 09:40:26 crc kubenswrapper[4830]: I0311 09:40:26.932324 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:40:26 crc kubenswrapper[4830]: E0311 09:40:26.932580 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:40:26 crc kubenswrapper[4830]: I0311 09:40:26.942340 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" path="/var/lib/kubelet/pods/2635fd4f-dc7a-4524-bf2d-6307f49363c9/volumes" Mar 11 09:40:29 crc kubenswrapper[4830]: I0311 09:40:29.537891 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ld6pc"] Mar 11 09:40:29 crc kubenswrapper[4830]: I0311 09:40:29.538237 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ld6pc" podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerName="registry-server" containerID="cri-o://1dd692080f8bc78c98ab847e01a17d943ce738b215d3a5cb9e164c860b5aadf9" gracePeriod=2 Mar 11 09:40:29 crc kubenswrapper[4830]: I0311 09:40:29.670373 4830 generic.go:334] "Generic (PLEG): container 
finished" podID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerID="1dd692080f8bc78c98ab847e01a17d943ce738b215d3a5cb9e164c860b5aadf9" exitCode=0 Mar 11 09:40:29 crc kubenswrapper[4830]: I0311 09:40:29.670449 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ld6pc" event={"ID":"bb1f2a22-9450-4112-9a20-d8b650cb410b","Type":"ContainerDied","Data":"1dd692080f8bc78c98ab847e01a17d943ce738b215d3a5cb9e164c860b5aadf9"} Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.010846 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.170548 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m7ff\" (UniqueName: \"kubernetes.io/projected/bb1f2a22-9450-4112-9a20-d8b650cb410b-kube-api-access-4m7ff\") pod \"bb1f2a22-9450-4112-9a20-d8b650cb410b\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.170922 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-catalog-content\") pod \"bb1f2a22-9450-4112-9a20-d8b650cb410b\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.171114 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-utilities\") pod \"bb1f2a22-9450-4112-9a20-d8b650cb410b\" (UID: \"bb1f2a22-9450-4112-9a20-d8b650cb410b\") " Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.171964 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-utilities" (OuterVolumeSpecName: "utilities") pod 
"bb1f2a22-9450-4112-9a20-d8b650cb410b" (UID: "bb1f2a22-9450-4112-9a20-d8b650cb410b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.175980 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb1f2a22-9450-4112-9a20-d8b650cb410b-kube-api-access-4m7ff" (OuterVolumeSpecName: "kube-api-access-4m7ff") pod "bb1f2a22-9450-4112-9a20-d8b650cb410b" (UID: "bb1f2a22-9450-4112-9a20-d8b650cb410b"). InnerVolumeSpecName "kube-api-access-4m7ff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.193142 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb1f2a22-9450-4112-9a20-d8b650cb410b" (UID: "bb1f2a22-9450-4112-9a20-d8b650cb410b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.273210 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.273249 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb1f2a22-9450-4112-9a20-d8b650cb410b-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.273259 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m7ff\" (UniqueName: \"kubernetes.io/projected/bb1f2a22-9450-4112-9a20-d8b650cb410b-kube-api-access-4m7ff\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.472374 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.472445 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.525194 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.681213 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ld6pc" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.681207 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ld6pc" event={"ID":"bb1f2a22-9450-4112-9a20-d8b650cb410b","Type":"ContainerDied","Data":"bf10e1e68059884c234ffc66a9938ff647d522313050fb665b62f4468fc2c2a3"} Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.681294 4830 scope.go:117] "RemoveContainer" containerID="1dd692080f8bc78c98ab847e01a17d943ce738b215d3a5cb9e164c860b5aadf9" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.721170 4830 scope.go:117] "RemoveContainer" containerID="24121f8b41de53a0a961e2d02e5ea4a1dded98e808056dfd940b30aa6b6beadd" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.728747 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ld6pc"] Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.734967 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.738503 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ld6pc"] Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.763129 4830 scope.go:117] "RemoveContainer" containerID="3cac5c22189c5b9f02535366a27c20ffb1cd5c518b9a96545724314f665c87d7" Mar 11 09:40:30 crc kubenswrapper[4830]: I0311 09:40:30.944834 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" path="/var/lib/kubelet/pods/bb1f2a22-9450-4112-9a20-d8b650cb410b/volumes" Mar 11 09:40:31 crc kubenswrapper[4830]: I0311 09:40:31.928331 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jn8f"] Mar 11 09:40:32 crc kubenswrapper[4830]: I0311 09:40:32.706551 4830 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-8jn8f" podUID="fa5b65bd-2388-4817-a862-146906fce050" containerName="registry-server" containerID="cri-o://c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8" gracePeriod=2 Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.184059 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.336773 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-utilities\") pod \"fa5b65bd-2388-4817-a862-146906fce050\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.336908 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g2kj\" (UniqueName: \"kubernetes.io/projected/fa5b65bd-2388-4817-a862-146906fce050-kube-api-access-9g2kj\") pod \"fa5b65bd-2388-4817-a862-146906fce050\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.336976 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-catalog-content\") pod \"fa5b65bd-2388-4817-a862-146906fce050\" (UID: \"fa5b65bd-2388-4817-a862-146906fce050\") " Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.337967 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-utilities" (OuterVolumeSpecName: "utilities") pod "fa5b65bd-2388-4817-a862-146906fce050" (UID: "fa5b65bd-2388-4817-a862-146906fce050"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.350379 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5b65bd-2388-4817-a862-146906fce050-kube-api-access-9g2kj" (OuterVolumeSpecName: "kube-api-access-9g2kj") pod "fa5b65bd-2388-4817-a862-146906fce050" (UID: "fa5b65bd-2388-4817-a862-146906fce050"). InnerVolumeSpecName "kube-api-access-9g2kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.389337 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa5b65bd-2388-4817-a862-146906fce050" (UID: "fa5b65bd-2388-4817-a862-146906fce050"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.439001 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g2kj\" (UniqueName: \"kubernetes.io/projected/fa5b65bd-2388-4817-a862-146906fce050-kube-api-access-9g2kj\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.439064 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.439077 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa5b65bd-2388-4817-a862-146906fce050-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.719124 4830 generic.go:334] "Generic (PLEG): container finished" podID="fa5b65bd-2388-4817-a862-146906fce050" 
containerID="c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8" exitCode=0 Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.719384 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jn8f" event={"ID":"fa5b65bd-2388-4817-a862-146906fce050","Type":"ContainerDied","Data":"c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8"} Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.719414 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8jn8f" event={"ID":"fa5b65bd-2388-4817-a862-146906fce050","Type":"ContainerDied","Data":"886364c872485a56f7cf05f3e80795fefed1c5ff74530a9b71b9cd7342c52885"} Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.719415 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8jn8f" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.719432 4830 scope.go:117] "RemoveContainer" containerID="c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.741093 4830 scope.go:117] "RemoveContainer" containerID="6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.767718 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8jn8f"] Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.779543 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8jn8f"] Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.789567 4830 scope.go:117] "RemoveContainer" containerID="9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.809561 4830 scope.go:117] "RemoveContainer" containerID="c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8" Mar 11 
09:40:33 crc kubenswrapper[4830]: E0311 09:40:33.809955 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8\": container with ID starting with c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8 not found: ID does not exist" containerID="c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.809993 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8"} err="failed to get container status \"c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8\": rpc error: code = NotFound desc = could not find container \"c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8\": container with ID starting with c0855586bbb089fccd24830e264ab5b95716ec61b4c5770b92aaa77c52e2f9d8 not found: ID does not exist" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.810028 4830 scope.go:117] "RemoveContainer" containerID="6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910" Mar 11 09:40:33 crc kubenswrapper[4830]: E0311 09:40:33.810434 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910\": container with ID starting with 6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910 not found: ID does not exist" containerID="6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.810490 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910"} err="failed to get container status 
\"6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910\": rpc error: code = NotFound desc = could not find container \"6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910\": container with ID starting with 6712570d4e47a2d479855cebec28eddc30a08f9d30430b7d5033d7b95c2a9910 not found: ID does not exist" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.810531 4830 scope.go:117] "RemoveContainer" containerID="9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0" Mar 11 09:40:33 crc kubenswrapper[4830]: E0311 09:40:33.810868 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0\": container with ID starting with 9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0 not found: ID does not exist" containerID="9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0" Mar 11 09:40:33 crc kubenswrapper[4830]: I0311 09:40:33.810929 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0"} err="failed to get container status \"9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0\": rpc error: code = NotFound desc = could not find container \"9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0\": container with ID starting with 9c103ea2fdee82387673c390c72113653d8abad71b85a1d2cde22da2544318a0 not found: ID does not exist" Mar 11 09:40:34 crc kubenswrapper[4830]: I0311 09:40:34.951468 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5b65bd-2388-4817-a862-146906fce050" path="/var/lib/kubelet/pods/fa5b65bd-2388-4817-a862-146906fce050/volumes" Mar 11 09:40:40 crc kubenswrapper[4830]: I0311 09:40:40.934128 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 
09:40:40 crc kubenswrapper[4830]: E0311 09:40:40.936059 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:40:53 crc kubenswrapper[4830]: I0311 09:40:53.933390 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:40:53 crc kubenswrapper[4830]: E0311 09:40:53.935271 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:41:06 crc kubenswrapper[4830]: I0311 09:41:06.933422 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:41:06 crc kubenswrapper[4830]: E0311 09:41:06.934173 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:41:09 crc kubenswrapper[4830]: I0311 09:41:09.236544 4830 scope.go:117] "RemoveContainer" 
containerID="154a84252a8c8c643706a35abf7adbec751e0ec395aa5f2f14476ddc0117fdca" Mar 11 09:41:09 crc kubenswrapper[4830]: I0311 09:41:09.314393 4830 scope.go:117] "RemoveContainer" containerID="323cfffe2a76ceb6926439e0777c22f7c6fb9dfdb6a2336bd6f147f6970ceb6c" Mar 11 09:41:09 crc kubenswrapper[4830]: I0311 09:41:09.340253 4830 scope.go:117] "RemoveContainer" containerID="6e1114c70394434abf4089f3b9d9141ed415f68925e14b3295c44bfcf213ae1e" Mar 11 09:41:09 crc kubenswrapper[4830]: I0311 09:41:09.399089 4830 scope.go:117] "RemoveContainer" containerID="8a0c2f44d5c787fc5a02e301efe54cf4145e529ba5c3b55b9e617dbf9b7b8003" Mar 11 09:41:09 crc kubenswrapper[4830]: I0311 09:41:09.443753 4830 scope.go:117] "RemoveContainer" containerID="b56d6f54832da7b7c2a6d9e1058daf4c7149770f50afb7cc2145d21ed8dc4dcc" Mar 11 09:41:09 crc kubenswrapper[4830]: I0311 09:41:09.540206 4830 scope.go:117] "RemoveContainer" containerID="db72a4b45d8609aa4b31c10f192a8c101361b5bbb2227097178de533a6c8c11e" Mar 11 09:41:17 crc kubenswrapper[4830]: I0311 09:41:17.932914 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:41:17 crc kubenswrapper[4830]: E0311 09:41:17.933784 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:41:29 crc kubenswrapper[4830]: I0311 09:41:29.932865 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:41:29 crc kubenswrapper[4830]: E0311 09:41:29.933651 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:41:43 crc kubenswrapper[4830]: I0311 09:41:43.933566 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:41:43 crc kubenswrapper[4830]: E0311 09:41:43.934435 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:41:58 crc kubenswrapper[4830]: I0311 09:41:58.933082 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:41:58 crc kubenswrapper[4830]: E0311 09:41:58.933900 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.150471 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553702-qk2gt"] Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.150937 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="extract-utilities" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.150955 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="extract-utilities" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.150974 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.150982 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151005 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerName="extract-content" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151014 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerName="extract-content" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151043 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5b65bd-2388-4817-a862-146906fce050" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151054 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5b65bd-2388-4817-a862-146906fce050" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151065 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5b65bd-2388-4817-a862-146906fce050" containerName="extract-utilities" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151073 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5b65bd-2388-4817-a862-146906fce050" containerName="extract-utilities" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151090 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="extract-content" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151097 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="extract-content" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151109 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerName="extract-utilities" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151117 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerName="extract-utilities" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151130 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151139 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151150 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151158 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151170 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerName="extract-utilities" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151177 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerName="extract-utilities" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151194 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerName="extract-content" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151201 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerName="extract-content" Mar 11 09:42:00 crc kubenswrapper[4830]: E0311 09:42:00.151217 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5b65bd-2388-4817-a862-146906fce050" containerName="extract-content" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151224 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5b65bd-2388-4817-a862-146906fce050" containerName="extract-content" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151470 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2635fd4f-dc7a-4524-bf2d-6307f49363c9" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151486 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5b65bd-2388-4817-a862-146906fce050" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151500 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="679a3e99-cdd1-4ae4-b4f5-e7b7ebe55efc" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.151515 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb1f2a22-9450-4112-9a20-d8b650cb410b" containerName="registry-server" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.152248 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553702-qk2gt" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.156076 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.156256 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.156368 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.160853 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553702-qk2gt"] Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.235941 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr6hq\" (UniqueName: \"kubernetes.io/projected/92bb55df-aad1-44a2-88f4-e61040246ef8-kube-api-access-fr6hq\") pod \"auto-csr-approver-29553702-qk2gt\" (UID: \"92bb55df-aad1-44a2-88f4-e61040246ef8\") " pod="openshift-infra/auto-csr-approver-29553702-qk2gt" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.337301 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr6hq\" (UniqueName: \"kubernetes.io/projected/92bb55df-aad1-44a2-88f4-e61040246ef8-kube-api-access-fr6hq\") pod \"auto-csr-approver-29553702-qk2gt\" (UID: \"92bb55df-aad1-44a2-88f4-e61040246ef8\") " pod="openshift-infra/auto-csr-approver-29553702-qk2gt" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.362627 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr6hq\" (UniqueName: \"kubernetes.io/projected/92bb55df-aad1-44a2-88f4-e61040246ef8-kube-api-access-fr6hq\") pod \"auto-csr-approver-29553702-qk2gt\" (UID: \"92bb55df-aad1-44a2-88f4-e61040246ef8\") " 
pod="openshift-infra/auto-csr-approver-29553702-qk2gt" Mar 11 09:42:00 crc kubenswrapper[4830]: I0311 09:42:00.475334 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553702-qk2gt" Mar 11 09:42:01 crc kubenswrapper[4830]: I0311 09:42:01.050330 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553702-qk2gt"] Mar 11 09:42:01 crc kubenswrapper[4830]: I0311 09:42:01.607498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553702-qk2gt" event={"ID":"92bb55df-aad1-44a2-88f4-e61040246ef8","Type":"ContainerStarted","Data":"fd0e0a8b9333d82eb539da71612de02b95e2e98bcfc799239566c81e60b64fdb"} Mar 11 09:42:03 crc kubenswrapper[4830]: I0311 09:42:03.628646 4830 generic.go:334] "Generic (PLEG): container finished" podID="92bb55df-aad1-44a2-88f4-e61040246ef8" containerID="61ece24803ef2b400cdff4a524abdc00b253867f21f2545cb593bcdda494db78" exitCode=0 Mar 11 09:42:03 crc kubenswrapper[4830]: I0311 09:42:03.629172 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553702-qk2gt" event={"ID":"92bb55df-aad1-44a2-88f4-e61040246ef8","Type":"ContainerDied","Data":"61ece24803ef2b400cdff4a524abdc00b253867f21f2545cb593bcdda494db78"} Mar 11 09:42:04 crc kubenswrapper[4830]: I0311 09:42:04.941564 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553702-qk2gt" Mar 11 09:42:05 crc kubenswrapper[4830]: I0311 09:42:05.038316 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr6hq\" (UniqueName: \"kubernetes.io/projected/92bb55df-aad1-44a2-88f4-e61040246ef8-kube-api-access-fr6hq\") pod \"92bb55df-aad1-44a2-88f4-e61040246ef8\" (UID: \"92bb55df-aad1-44a2-88f4-e61040246ef8\") " Mar 11 09:42:05 crc kubenswrapper[4830]: I0311 09:42:05.071095 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92bb55df-aad1-44a2-88f4-e61040246ef8-kube-api-access-fr6hq" (OuterVolumeSpecName: "kube-api-access-fr6hq") pod "92bb55df-aad1-44a2-88f4-e61040246ef8" (UID: "92bb55df-aad1-44a2-88f4-e61040246ef8"). InnerVolumeSpecName "kube-api-access-fr6hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:42:05 crc kubenswrapper[4830]: I0311 09:42:05.140765 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr6hq\" (UniqueName: \"kubernetes.io/projected/92bb55df-aad1-44a2-88f4-e61040246ef8-kube-api-access-fr6hq\") on node \"crc\" DevicePath \"\"" Mar 11 09:42:05 crc kubenswrapper[4830]: I0311 09:42:05.645750 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553702-qk2gt" event={"ID":"92bb55df-aad1-44a2-88f4-e61040246ef8","Type":"ContainerDied","Data":"fd0e0a8b9333d82eb539da71612de02b95e2e98bcfc799239566c81e60b64fdb"} Mar 11 09:42:05 crc kubenswrapper[4830]: I0311 09:42:05.645794 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0e0a8b9333d82eb539da71612de02b95e2e98bcfc799239566c81e60b64fdb" Mar 11 09:42:05 crc kubenswrapper[4830]: I0311 09:42:05.645834 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553702-qk2gt" Mar 11 09:42:06 crc kubenswrapper[4830]: I0311 09:42:06.028197 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553696-9lrt8"] Mar 11 09:42:06 crc kubenswrapper[4830]: I0311 09:42:06.040145 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553696-9lrt8"] Mar 11 09:42:06 crc kubenswrapper[4830]: I0311 09:42:06.946430 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04bdf020-11b4-4125-ad87-0a30df4278b9" path="/var/lib/kubelet/pods/04bdf020-11b4-4125-ad87-0a30df4278b9/volumes" Mar 11 09:42:09 crc kubenswrapper[4830]: I0311 09:42:09.745938 4830 scope.go:117] "RemoveContainer" containerID="ec23955f74aa6bd8b3869586d4a33d78135366dabb6184a74677c328953c2e1b" Mar 11 09:42:12 crc kubenswrapper[4830]: I0311 09:42:12.940657 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:42:12 crc kubenswrapper[4830]: E0311 09:42:12.941486 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:42:27 crc kubenswrapper[4830]: I0311 09:42:27.933057 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:42:27 crc kubenswrapper[4830]: E0311 09:42:27.933773 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:42:39 crc kubenswrapper[4830]: I0311 09:42:39.933002 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:42:39 crc kubenswrapper[4830]: E0311 09:42:39.933819 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:42:40 crc kubenswrapper[4830]: I0311 09:42:40.951289 4830 generic.go:334] "Generic (PLEG): container finished" podID="751133b2-5530-48d1-9cb0-4e69aadf979a" containerID="88054b8385a79bcda7195e05905b1f3fa5fcdaa57f23efbcc879d9b9f2c3e83f" exitCode=0 Mar 11 09:42:40 crc kubenswrapper[4830]: I0311 09:42:40.951387 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" event={"ID":"751133b2-5530-48d1-9cb0-4e69aadf979a","Type":"ContainerDied","Data":"88054b8385a79bcda7195e05905b1f3fa5fcdaa57f23efbcc879d9b9f2c3e83f"} Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.383761 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.461581 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6snb9\" (UniqueName: \"kubernetes.io/projected/751133b2-5530-48d1-9cb0-4e69aadf979a-kube-api-access-6snb9\") pod \"751133b2-5530-48d1-9cb0-4e69aadf979a\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.461691 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-bootstrap-combined-ca-bundle\") pod \"751133b2-5530-48d1-9cb0-4e69aadf979a\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.461849 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-inventory\") pod \"751133b2-5530-48d1-9cb0-4e69aadf979a\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.461903 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-ssh-key-openstack-edpm-ipam\") pod \"751133b2-5530-48d1-9cb0-4e69aadf979a\" (UID: \"751133b2-5530-48d1-9cb0-4e69aadf979a\") " Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.467855 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751133b2-5530-48d1-9cb0-4e69aadf979a-kube-api-access-6snb9" (OuterVolumeSpecName: "kube-api-access-6snb9") pod "751133b2-5530-48d1-9cb0-4e69aadf979a" (UID: "751133b2-5530-48d1-9cb0-4e69aadf979a"). InnerVolumeSpecName "kube-api-access-6snb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.468443 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "751133b2-5530-48d1-9cb0-4e69aadf979a" (UID: "751133b2-5530-48d1-9cb0-4e69aadf979a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.490116 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "751133b2-5530-48d1-9cb0-4e69aadf979a" (UID: "751133b2-5530-48d1-9cb0-4e69aadf979a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.491399 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-inventory" (OuterVolumeSpecName: "inventory") pod "751133b2-5530-48d1-9cb0-4e69aadf979a" (UID: "751133b2-5530-48d1-9cb0-4e69aadf979a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.564338 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6snb9\" (UniqueName: \"kubernetes.io/projected/751133b2-5530-48d1-9cb0-4e69aadf979a-kube-api-access-6snb9\") on node \"crc\" DevicePath \"\"" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.564821 4830 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.564838 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.564852 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/751133b2-5530-48d1-9cb0-4e69aadf979a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.969505 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" event={"ID":"751133b2-5530-48d1-9cb0-4e69aadf979a","Type":"ContainerDied","Data":"95eaf439e64d209d93af6ef296c470ae8b289e49238308d6da4d3261eaf6107f"} Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.969583 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95eaf439e64d209d93af6ef296c470ae8b289e49238308d6da4d3261eaf6107f" Mar 11 09:42:42 crc kubenswrapper[4830]: I0311 09:42:42.969593 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.053726 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"] Mar 11 09:42:43 crc kubenswrapper[4830]: E0311 09:42:43.054108 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751133b2-5530-48d1-9cb0-4e69aadf979a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.054126 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="751133b2-5530-48d1-9cb0-4e69aadf979a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 11 09:42:43 crc kubenswrapper[4830]: E0311 09:42:43.054154 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92bb55df-aad1-44a2-88f4-e61040246ef8" containerName="oc" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.054160 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="92bb55df-aad1-44a2-88f4-e61040246ef8" containerName="oc" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.054321 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="751133b2-5530-48d1-9cb0-4e69aadf979a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.054339 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="92bb55df-aad1-44a2-88f4-e61040246ef8" containerName="oc" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.055057 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.057361 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.057516 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.058429 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.058838 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.119694 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"] Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.175694 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.175761 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxhqf\" (UniqueName: \"kubernetes.io/projected/fae734f9-b26d-4252-b943-b09b3e235cfa-kube-api-access-pxhqf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 
09:42:43.175801 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.278014 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxhqf\" (UniqueName: \"kubernetes.io/projected/fae734f9-b26d-4252-b943-b09b3e235cfa-kube-api-access-pxhqf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.278119 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.278292 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.282625 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"
Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.282646 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"
Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.295289 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxhqf\" (UniqueName: \"kubernetes.io/projected/fae734f9-b26d-4252-b943-b09b3e235cfa-kube-api-access-pxhqf\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"
Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.422536 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"
Mar 11 09:42:43 crc kubenswrapper[4830]: I0311 09:42:43.993537 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"]
Mar 11 09:42:44 crc kubenswrapper[4830]: I0311 09:42:44.989913 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" event={"ID":"fae734f9-b26d-4252-b943-b09b3e235cfa","Type":"ContainerStarted","Data":"4c9bccd9f50cc5046ca9f358ad3228637c16cdd2116cb3e86a88601dece37ee8"}
Mar 11 09:42:46 crc kubenswrapper[4830]: I0311 09:42:46.000391 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" event={"ID":"fae734f9-b26d-4252-b943-b09b3e235cfa","Type":"ContainerStarted","Data":"1f7aad913b3cd71ef7975236dd26a58238cc4b5a777fd6f95e2a942f94c1005d"}
Mar 11 09:42:46 crc kubenswrapper[4830]: I0311 09:42:46.023011 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" podStartSLOduration=2.250574282 podStartE2EDuration="3.022993175s" podCreationTimestamp="2026-03-11 09:42:43 +0000 UTC" firstStartedPulling="2026-03-11 09:42:44.003703716 +0000 UTC m=+1731.784854415" lastFinishedPulling="2026-03-11 09:42:44.776122599 +0000 UTC m=+1732.557273308" observedRunningTime="2026-03-11 09:42:46.015636732 +0000 UTC m=+1733.796787421" watchObservedRunningTime="2026-03-11 09:42:46.022993175 +0000 UTC m=+1733.804143864"
Mar 11 09:42:51 crc kubenswrapper[4830]: I0311 09:42:51.933714 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:42:51 crc kubenswrapper[4830]: E0311 09:42:51.934766 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:43:04 crc kubenswrapper[4830]: I0311 09:43:04.933436 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:43:04 crc kubenswrapper[4830]: E0311 09:43:04.934203 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:43:09 crc kubenswrapper[4830]: I0311 09:43:09.883885 4830 scope.go:117] "RemoveContainer" containerID="6b8eb712f98e7f7c153f5ee3fe37d54c8457ea4ddff207f832dde3bb84ecaa9e"
Mar 11 09:43:09 crc kubenswrapper[4830]: I0311 09:43:09.918805 4830 scope.go:117] "RemoveContainer" containerID="9f7699a8975d2876363c67476e644a5ef45ba1160b67faaad9196da93584601f"
Mar 11 09:43:09 crc kubenswrapper[4830]: I0311 09:43:09.947479 4830 scope.go:117] "RemoveContainer" containerID="7b1c25d3125d8e397653d1b870709aeb71fc9b83e76f2ddf62567f90c902d105"
Mar 11 09:43:09 crc kubenswrapper[4830]: I0311 09:43:09.969211 4830 scope.go:117] "RemoveContainer" containerID="acb320b4280fed5ec1050ed40d9719952f4b6f7144d5345114350863442fb9d4"
Mar 11 09:43:10 crc kubenswrapper[4830]: I0311 09:43:10.028421 4830 scope.go:117] "RemoveContainer" containerID="539360b564f81f0d337c6d6f03c266ad668f86480266c5cc6f96ab141e110887"
Mar 11 09:43:19 crc kubenswrapper[4830]: I0311 09:43:19.932488 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:43:19 crc kubenswrapper[4830]: E0311 09:43:19.933202 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:43:34 crc kubenswrapper[4830]: I0311 09:43:34.932114 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:43:34 crc kubenswrapper[4830]: E0311 09:43:34.932871 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:43:45 crc kubenswrapper[4830]: I0311 09:43:45.935549 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:43:45 crc kubenswrapper[4830]: E0311 09:43:45.936620 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:43:52 crc kubenswrapper[4830]: I0311 09:43:52.048324 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c667-account-create-update-cpknz"]
Mar 11 09:43:52 crc kubenswrapper[4830]: I0311 09:43:52.061928 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-p7nmp"]
Mar 11 09:43:52 crc kubenswrapper[4830]: I0311 09:43:52.072197 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c667-account-create-update-cpknz"]
Mar 11 09:43:52 crc kubenswrapper[4830]: I0311 09:43:52.081053 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-p7nmp"]
Mar 11 09:43:52 crc kubenswrapper[4830]: I0311 09:43:52.948053 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b71ab87-579d-41fb-b094-92cc4e3fe5b6" path="/var/lib/kubelet/pods/0b71ab87-579d-41fb-b094-92cc4e3fe5b6/volumes"
Mar 11 09:43:52 crc kubenswrapper[4830]: I0311 09:43:52.953813 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42ddef2-86d1-435e-b240-f159db41a2d2" path="/var/lib/kubelet/pods/a42ddef2-86d1-435e-b240-f159db41a2d2/volumes"
Mar 11 09:43:53 crc kubenswrapper[4830]: I0311 09:43:53.039527 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mmh84"]
Mar 11 09:43:53 crc kubenswrapper[4830]: I0311 09:43:53.054049 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-dd45-account-create-update-8gxtl"]
Mar 11 09:43:53 crc kubenswrapper[4830]: I0311 09:43:53.067525 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mmh84"]
Mar 11 09:43:53 crc kubenswrapper[4830]: I0311 09:43:53.075893 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-dd45-account-create-update-8gxtl"]
Mar 11 09:43:54 crc kubenswrapper[4830]: I0311 09:43:54.047333 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-125c-account-create-update-rnfqn"]
Mar 11 09:43:54 crc kubenswrapper[4830]: I0311 09:43:54.067224 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-nvf88"]
Mar 11 09:43:54 crc kubenswrapper[4830]: I0311 09:43:54.077157 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-125c-account-create-update-rnfqn"]
Mar 11 09:43:54 crc kubenswrapper[4830]: I0311 09:43:54.086452 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-nvf88"]
Mar 11 09:43:54 crc kubenswrapper[4830]: I0311 09:43:54.968684 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a18449-bdae-4eab-a9f8-816cf3064929" path="/var/lib/kubelet/pods/76a18449-bdae-4eab-a9f8-816cf3064929/volumes"
Mar 11 09:43:54 crc kubenswrapper[4830]: I0311 09:43:54.971935 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b" path="/var/lib/kubelet/pods/88ea8a5a-b51d-4cec-a7b6-c83a3fdc428b/volumes"
Mar 11 09:43:54 crc kubenswrapper[4830]: I0311 09:43:54.975163 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6497d6f-b364-4d49-8825-08dd005ef1a1" path="/var/lib/kubelet/pods/d6497d6f-b364-4d49-8825-08dd005ef1a1/volumes"
Mar 11 09:43:54 crc kubenswrapper[4830]: I0311 09:43:54.977671 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f302b2-d44c-4598-89d7-1592d4e14f6d" path="/var/lib/kubelet/pods/f8f302b2-d44c-4598-89d7-1592d4e14f6d/volumes"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.166871 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553704-n9t6z"]
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.169622 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553704-n9t6z"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.174271 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.174524 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.174716 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.189611 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553704-n9t6z"]
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.214384 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66g4c\" (UniqueName: \"kubernetes.io/projected/f6fbeba2-4a06-4727-9bae-8470dc0b1c4e-kube-api-access-66g4c\") pod \"auto-csr-approver-29553704-n9t6z\" (UID: \"f6fbeba2-4a06-4727-9bae-8470dc0b1c4e\") " pod="openshift-infra/auto-csr-approver-29553704-n9t6z"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.316257 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66g4c\" (UniqueName: \"kubernetes.io/projected/f6fbeba2-4a06-4727-9bae-8470dc0b1c4e-kube-api-access-66g4c\") pod \"auto-csr-approver-29553704-n9t6z\" (UID: \"f6fbeba2-4a06-4727-9bae-8470dc0b1c4e\") " pod="openshift-infra/auto-csr-approver-29553704-n9t6z"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.338305 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66g4c\" (UniqueName: \"kubernetes.io/projected/f6fbeba2-4a06-4727-9bae-8470dc0b1c4e-kube-api-access-66g4c\") pod \"auto-csr-approver-29553704-n9t6z\" (UID: \"f6fbeba2-4a06-4727-9bae-8470dc0b1c4e\") " pod="openshift-infra/auto-csr-approver-29553704-n9t6z"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.503444 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553704-n9t6z"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.932908 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:44:00 crc kubenswrapper[4830]: E0311 09:44:00.934234 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:44:00 crc kubenswrapper[4830]: I0311 09:44:00.978363 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553704-n9t6z"]
Mar 11 09:44:00 crc kubenswrapper[4830]: W0311 09:44:00.989127 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6fbeba2_4a06_4727_9bae_8470dc0b1c4e.slice/crio-b2ddfee3d873945606384f921a73440ce676eaa78819a5e1bc69b70e3565c076 WatchSource:0}: Error finding container b2ddfee3d873945606384f921a73440ce676eaa78819a5e1bc69b70e3565c076: Status 404 returned error can't find the container with id b2ddfee3d873945606384f921a73440ce676eaa78819a5e1bc69b70e3565c076
Mar 11 09:44:01 crc kubenswrapper[4830]: I0311 09:44:01.736796 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553704-n9t6z" event={"ID":"f6fbeba2-4a06-4727-9bae-8470dc0b1c4e","Type":"ContainerStarted","Data":"b2ddfee3d873945606384f921a73440ce676eaa78819a5e1bc69b70e3565c076"}
Mar 11 09:44:02 crc kubenswrapper[4830]: I0311 09:44:02.747619 4830 generic.go:334] "Generic (PLEG): container finished" podID="f6fbeba2-4a06-4727-9bae-8470dc0b1c4e" containerID="ca9ed8e579f7430a790d3a430c56bdb363dc32dedc87ac44b3208285e8a1a731" exitCode=0
Mar 11 09:44:02 crc kubenswrapper[4830]: I0311 09:44:02.747753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553704-n9t6z" event={"ID":"f6fbeba2-4a06-4727-9bae-8470dc0b1c4e","Type":"ContainerDied","Data":"ca9ed8e579f7430a790d3a430c56bdb363dc32dedc87ac44b3208285e8a1a731"}
Mar 11 09:44:04 crc kubenswrapper[4830]: I0311 09:44:04.076389 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553704-n9t6z"
Mar 11 09:44:04 crc kubenswrapper[4830]: I0311 09:44:04.108975 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66g4c\" (UniqueName: \"kubernetes.io/projected/f6fbeba2-4a06-4727-9bae-8470dc0b1c4e-kube-api-access-66g4c\") pod \"f6fbeba2-4a06-4727-9bae-8470dc0b1c4e\" (UID: \"f6fbeba2-4a06-4727-9bae-8470dc0b1c4e\") "
Mar 11 09:44:04 crc kubenswrapper[4830]: I0311 09:44:04.114815 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6fbeba2-4a06-4727-9bae-8470dc0b1c4e-kube-api-access-66g4c" (OuterVolumeSpecName: "kube-api-access-66g4c") pod "f6fbeba2-4a06-4727-9bae-8470dc0b1c4e" (UID: "f6fbeba2-4a06-4727-9bae-8470dc0b1c4e"). InnerVolumeSpecName "kube-api-access-66g4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:44:04 crc kubenswrapper[4830]: I0311 09:44:04.212291 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66g4c\" (UniqueName: \"kubernetes.io/projected/f6fbeba2-4a06-4727-9bae-8470dc0b1c4e-kube-api-access-66g4c\") on node \"crc\" DevicePath \"\""
Mar 11 09:44:04 crc kubenswrapper[4830]: I0311 09:44:04.775912 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553704-n9t6z" event={"ID":"f6fbeba2-4a06-4727-9bae-8470dc0b1c4e","Type":"ContainerDied","Data":"b2ddfee3d873945606384f921a73440ce676eaa78819a5e1bc69b70e3565c076"}
Mar 11 09:44:04 crc kubenswrapper[4830]: I0311 09:44:04.775982 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2ddfee3d873945606384f921a73440ce676eaa78819a5e1bc69b70e3565c076"
Mar 11 09:44:04 crc kubenswrapper[4830]: I0311 09:44:04.776161 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553704-n9t6z"
Mar 11 09:44:05 crc kubenswrapper[4830]: I0311 09:44:05.140230 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553698-9mp47"]
Mar 11 09:44:05 crc kubenswrapper[4830]: I0311 09:44:05.150039 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553698-9mp47"]
Mar 11 09:44:05 crc kubenswrapper[4830]: I0311 09:44:05.786697 4830 generic.go:334] "Generic (PLEG): container finished" podID="fae734f9-b26d-4252-b943-b09b3e235cfa" containerID="1f7aad913b3cd71ef7975236dd26a58238cc4b5a777fd6f95e2a942f94c1005d" exitCode=0
Mar 11 09:44:05 crc kubenswrapper[4830]: I0311 09:44:05.786746 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" event={"ID":"fae734f9-b26d-4252-b943-b09b3e235cfa","Type":"ContainerDied","Data":"1f7aad913b3cd71ef7975236dd26a58238cc4b5a777fd6f95e2a942f94c1005d"}
Mar 11 09:44:06 crc kubenswrapper[4830]: I0311 09:44:06.942805 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c94f958-2a99-43d5-bee6-f5e2c9ed9e80" path="/var/lib/kubelet/pods/8c94f958-2a99-43d5-bee6-f5e2c9ed9e80/volumes"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.134182 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.165728 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-ssh-key-openstack-edpm-ipam\") pod \"fae734f9-b26d-4252-b943-b09b3e235cfa\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") "
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.165819 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxhqf\" (UniqueName: \"kubernetes.io/projected/fae734f9-b26d-4252-b943-b09b3e235cfa-kube-api-access-pxhqf\") pod \"fae734f9-b26d-4252-b943-b09b3e235cfa\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") "
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.165974 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-inventory\") pod \"fae734f9-b26d-4252-b943-b09b3e235cfa\" (UID: \"fae734f9-b26d-4252-b943-b09b3e235cfa\") "
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.171301 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae734f9-b26d-4252-b943-b09b3e235cfa-kube-api-access-pxhqf" (OuterVolumeSpecName: "kube-api-access-pxhqf") pod "fae734f9-b26d-4252-b943-b09b3e235cfa" (UID: "fae734f9-b26d-4252-b943-b09b3e235cfa"). InnerVolumeSpecName "kube-api-access-pxhqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.193186 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fae734f9-b26d-4252-b943-b09b3e235cfa" (UID: "fae734f9-b26d-4252-b943-b09b3e235cfa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.194525 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-inventory" (OuterVolumeSpecName: "inventory") pod "fae734f9-b26d-4252-b943-b09b3e235cfa" (UID: "fae734f9-b26d-4252-b943-b09b3e235cfa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.267666 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.267712 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxhqf\" (UniqueName: \"kubernetes.io/projected/fae734f9-b26d-4252-b943-b09b3e235cfa-kube-api-access-pxhqf\") on node \"crc\" DevicePath \"\""
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.267723 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fae734f9-b26d-4252-b943-b09b3e235cfa-inventory\") on node \"crc\" DevicePath \"\""
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.813346 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5" event={"ID":"fae734f9-b26d-4252-b943-b09b3e235cfa","Type":"ContainerDied","Data":"4c9bccd9f50cc5046ca9f358ad3228637c16cdd2116cb3e86a88601dece37ee8"}
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.813388 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c9bccd9f50cc5046ca9f358ad3228637c16cdd2116cb3e86a88601dece37ee8"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.813621 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.897447 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"]
Mar 11 09:44:07 crc kubenswrapper[4830]: E0311 09:44:07.898050 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae734f9-b26d-4252-b943-b09b3e235cfa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.898072 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae734f9-b26d-4252-b943-b09b3e235cfa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:44:07 crc kubenswrapper[4830]: E0311 09:44:07.898096 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fbeba2-4a06-4727-9bae-8470dc0b1c4e" containerName="oc"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.898105 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fbeba2-4a06-4727-9bae-8470dc0b1c4e" containerName="oc"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.898396 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6fbeba2-4a06-4727-9bae-8470dc0b1c4e" containerName="oc"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.898411 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae734f9-b26d-4252-b943-b09b3e235cfa" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.900110 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.902983 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.903196 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.903341 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.903557 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.913063 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"]
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.979044 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.979170 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-229r6\" (UniqueName: \"kubernetes.io/projected/9711332d-adac-4289-81e4-686135601f68-kube-api-access-229r6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:07 crc kubenswrapper[4830]: I0311 09:44:07.979664 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:08 crc kubenswrapper[4830]: I0311 09:44:08.082814 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:08 crc kubenswrapper[4830]: I0311 09:44:08.083073 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-229r6\" (UniqueName: \"kubernetes.io/projected/9711332d-adac-4289-81e4-686135601f68-kube-api-access-229r6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:08 crc kubenswrapper[4830]: I0311 09:44:08.083200 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:08 crc kubenswrapper[4830]: I0311 09:44:08.087372 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:08 crc kubenswrapper[4830]: I0311 09:44:08.087437 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:08 crc kubenswrapper[4830]: I0311 09:44:08.103845 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-229r6\" (UniqueName: \"kubernetes.io/projected/9711332d-adac-4289-81e4-686135601f68-kube-api-access-229r6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:08 crc kubenswrapper[4830]: I0311 09:44:08.280103 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:44:08 crc kubenswrapper[4830]: I0311 09:44:08.823557 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"]
Mar 11 09:44:09 crc kubenswrapper[4830]: I0311 09:44:09.838369 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx" event={"ID":"9711332d-adac-4289-81e4-686135601f68","Type":"ContainerStarted","Data":"586ef0827323ca422e05bbfaf5a2ca765e0ad19721d361f206fe2891fef2abd3"}
Mar 11 09:44:09 crc kubenswrapper[4830]: I0311 09:44:09.838909 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx" event={"ID":"9711332d-adac-4289-81e4-686135601f68","Type":"ContainerStarted","Data":"2726e2a00d5eef6111adc561bedf890eeafea445d0e6519dffa68f38fda6c00b"}
Mar 11 09:44:09 crc kubenswrapper[4830]: I0311 09:44:09.861918 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx" podStartSLOduration=2.190205322 podStartE2EDuration="2.861892326s" podCreationTimestamp="2026-03-11 09:44:07 +0000 UTC" firstStartedPulling="2026-03-11 09:44:08.828901677 +0000 UTC m=+1816.610052386" lastFinishedPulling="2026-03-11 09:44:09.500588691 +0000 UTC m=+1817.281739390" observedRunningTime="2026-03-11 09:44:09.855305417 +0000 UTC m=+1817.636456106" watchObservedRunningTime="2026-03-11 09:44:09.861892326 +0000 UTC m=+1817.643043055"
Mar 11 09:44:10 crc kubenswrapper[4830]: I0311 09:44:10.142383 4830 scope.go:117] "RemoveContainer" containerID="78d9cabea547fb73b5971d1d293cd333fe8bc2c62af293ba0d803ed1337258be"
Mar 11 09:44:10 crc kubenswrapper[4830]: I0311 09:44:10.181656 4830 scope.go:117] "RemoveContainer" containerID="f6a94db2e611365246e47462983d5ebdd1beadbe5c1b6c45eecb86efb40ba8e1"
Mar 11 09:44:10 crc kubenswrapper[4830]: I0311 09:44:10.215879 4830 scope.go:117] "RemoveContainer" containerID="969c747b2572a12bbb9f3852d36e59f19d952563c3924cb9adb87ac12691f447"
Mar 11 09:44:10 crc kubenswrapper[4830]: I0311 09:44:10.270080 4830 scope.go:117] "RemoveContainer" containerID="2b73748f10aa4dae27415bfa62deab42b8e1b2593deec5e6d6ae2eae42751d59"
Mar 11 09:44:10 crc kubenswrapper[4830]: I0311 09:44:10.301068 4830 scope.go:117] "RemoveContainer" containerID="f5853cc6f292f2f21212e4bae7ef4a727a593062e600d0aa5a83510a873757ca"
Mar 11 09:44:10 crc kubenswrapper[4830]: I0311 09:44:10.322540 4830 scope.go:117] "RemoveContainer" containerID="68135fe04ab55fcdbbc689c20b0ea6fe2184e13647d560a19d3c5234c6b72348"
Mar 11 09:44:10 crc kubenswrapper[4830]: I0311 09:44:10.361959 4830 scope.go:117] "RemoveContainer" containerID="e88371dc46ca1125d9d23340678b6e346e1f5327d6b895fe1455cd7c6fa7e8fc"
Mar 11 09:44:11 crc kubenswrapper[4830]: I0311 09:44:11.030946 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-x6g8g"]
Mar 11 09:44:11 crc kubenswrapper[4830]: I0311 09:44:11.041996 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-x6g8g"]
Mar 11 09:44:12 crc kubenswrapper[4830]: I0311 09:44:12.945640 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed47348-1ba4-41ad-a417-7b9c55bdb1f2" path="/var/lib/kubelet/pods/bed47348-1ba4-41ad-a417-7b9c55bdb1f2/volumes"
Mar 11 09:44:13 crc kubenswrapper[4830]: I0311 09:44:13.933118 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:44:13 crc kubenswrapper[4830]: E0311 09:44:13.933514 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:44:20 crc kubenswrapper[4830]: I0311 09:44:20.055867 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8cx6g"]
Mar 11 09:44:20 crc kubenswrapper[4830]: I0311 09:44:20.068920 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8cx6g"]
Mar 11 09:44:20 crc kubenswrapper[4830]: I0311 09:44:20.945477 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b917ac6a-dcd3-46e7-b4a5-65e7a5622959" path="/var/lib/kubelet/pods/b917ac6a-dcd3-46e7-b4a5-65e7a5622959/volumes"
Mar 11 09:44:23 crc kubenswrapper[4830]: I0311 09:44:23.033597 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-sl9ll"]
Mar 11 09:44:23 crc kubenswrapper[4830]: I0311 09:44:23.041460 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cx6qq"]
Mar 11 09:44:23 crc kubenswrapper[4830]: I0311 09:44:23.050341 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-sl9ll"]
Mar 11 09:44:23 crc kubenswrapper[4830]: I0311 09:44:23.058209 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cx6qq"]
Mar 11 09:44:24 crc kubenswrapper[4830]: I0311 09:44:24.945488 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="084f4a11-5159-49ca-b836-125e788f09e4" path="/var/lib/kubelet/pods/084f4a11-5159-49ca-b836-125e788f09e4/volumes"
Mar 11 09:44:24 crc kubenswrapper[4830]: I0311 09:44:24.947941 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7808b223-41ad-41c0-96f6-d7434ce65017" path="/var/lib/kubelet/pods/7808b223-41ad-41c0-96f6-d7434ce65017/volumes"
Mar 11 09:44:27 crc kubenswrapper[4830]: I0311 09:44:27.035166 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-62cb-account-create-update-gdbqf"]
Mar 11 09:44:27 crc kubenswrapper[4830]: I0311 09:44:27.046618 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-923e-account-create-update-rfpxl"]
Mar 11 09:44:27 crc kubenswrapper[4830]: I0311 09:44:27.057515 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-aea4-account-create-update-jvpsd"]
Mar 11 09:44:27 crc kubenswrapper[4830]: I0311 09:44:27.066783 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-62cb-account-create-update-gdbqf"]
Mar 11 09:44:27 crc kubenswrapper[4830]: I0311 09:44:27.075880 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-923e-account-create-update-rfpxl"]
Mar 11 09:44:27 crc kubenswrapper[4830]: I0311 09:44:27.084326 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-aea4-account-create-update-jvpsd"]
Mar 11 09:44:27 crc kubenswrapper[4830]: I0311 09:44:27.092588 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8kcdv"]
Mar 11 09:44:27 crc kubenswrapper[4830]: I0311 09:44:27.099762 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8kcdv"]
Mar 11 09:44:28 crc kubenswrapper[4830]: I0311 09:44:28.933091 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:44:28 crc kubenswrapper[4830]: E0311 09:44:28.933348 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:44:28 crc kubenswrapper[4830]: I0311 09:44:28.943101 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1089ecc0-fd27-494f-905f-a4cd117e66dd" path="/var/lib/kubelet/pods/1089ecc0-fd27-494f-905f-a4cd117e66dd/volumes"
Mar 11 09:44:28 crc kubenswrapper[4830]: I0311 09:44:28.944371 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d" path="/var/lib/kubelet/pods/1dcd19b1-d63e-4ce1-b4d9-82d3e257b83d/volumes"
Mar 11 09:44:28 crc kubenswrapper[4830]: I0311 09:44:28.945416 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96185ebd-9440-48a9-b1b6-0674ab5d4bb5" path="/var/lib/kubelet/pods/96185ebd-9440-48a9-b1b6-0674ab5d4bb5/volumes"
Mar 11 09:44:28 crc kubenswrapper[4830]: I0311 09:44:28.946444 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7738582-0c74-4445-9939-be3ca1ffeab5" path="/var/lib/kubelet/pods/f7738582-0c74-4445-9939-be3ca1ffeab5/volumes"
Mar 11 09:44:32 crc kubenswrapper[4830]: I0311 09:44:32.027715 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-tqb6c"]
Mar 11 09:44:32 crc kubenswrapper[4830]: I0311 09:44:32.035970 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-tqb6c"]
Mar 11 09:44:32 crc kubenswrapper[4830]: I0311 09:44:32.941967 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c5179b-9c10-4987-88cf-ba72ee746480" path="/var/lib/kubelet/pods/d8c5179b-9c10-4987-88cf-ba72ee746480/volumes"
Mar 11 09:44:41 crc kubenswrapper[4830]: I0311 09:44:41.934493 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:44:41 crc kubenswrapper[4830]: E0311 09:44:41.936322 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:44:56 crc kubenswrapper[4830]: I0311 09:44:56.933204 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:44:56 crc kubenswrapper[4830]: E0311 09:44:56.934920 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.154591 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"]
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.156387 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.159377 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.159628 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.165993 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"]
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.205396 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af93a25a-efc2-4d39-b5f0-e351484226c5-config-volume\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.205862 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhc5f\" (UniqueName: \"kubernetes.io/projected/af93a25a-efc2-4d39-b5f0-e351484226c5-kube-api-access-nhc5f\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.205948 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af93a25a-efc2-4d39-b5f0-e351484226c5-secret-volume\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.307930 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af93a25a-efc2-4d39-b5f0-e351484226c5-config-volume\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.308002 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhc5f\" (UniqueName: \"kubernetes.io/projected/af93a25a-efc2-4d39-b5f0-e351484226c5-kube-api-access-nhc5f\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.308053 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af93a25a-efc2-4d39-b5f0-e351484226c5-secret-volume\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.308835 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af93a25a-efc2-4d39-b5f0-e351484226c5-config-volume\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.315856 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af93a25a-efc2-4d39-b5f0-e351484226c5-secret-volume\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.324888 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhc5f\" (UniqueName: \"kubernetes.io/projected/af93a25a-efc2-4d39-b5f0-e351484226c5-kube-api-access-nhc5f\") pod \"collect-profiles-29553705-fc2fd\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.480574 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:00 crc kubenswrapper[4830]: I0311 09:45:00.919447 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"]
Mar 11 09:45:01 crc kubenswrapper[4830]: I0311 09:45:01.298931 4830 generic.go:334] "Generic (PLEG): container finished" podID="af93a25a-efc2-4d39-b5f0-e351484226c5" containerID="a9d137a706ca7c58edd8cc1260af672fb9da63a6817dac89631a74ddb151e0e4" exitCode=0
Mar 11 09:45:01 crc kubenswrapper[4830]: I0311 09:45:01.299028 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd" event={"ID":"af93a25a-efc2-4d39-b5f0-e351484226c5","Type":"ContainerDied","Data":"a9d137a706ca7c58edd8cc1260af672fb9da63a6817dac89631a74ddb151e0e4"}
Mar 11 09:45:01 crc kubenswrapper[4830]: I0311 09:45:01.299301 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd" event={"ID":"af93a25a-efc2-4d39-b5f0-e351484226c5","Type":"ContainerStarted","Data":"ff203dec6cebb922769f9a445813a6dc1e7cc1c5084356a5f06e8121de3e63f0"}
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.625404 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.650260 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af93a25a-efc2-4d39-b5f0-e351484226c5-config-volume\") pod \"af93a25a-efc2-4d39-b5f0-e351484226c5\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") "
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.650343 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af93a25a-efc2-4d39-b5f0-e351484226c5-secret-volume\") pod \"af93a25a-efc2-4d39-b5f0-e351484226c5\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") "
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.650415 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhc5f\" (UniqueName: \"kubernetes.io/projected/af93a25a-efc2-4d39-b5f0-e351484226c5-kube-api-access-nhc5f\") pod \"af93a25a-efc2-4d39-b5f0-e351484226c5\" (UID: \"af93a25a-efc2-4d39-b5f0-e351484226c5\") "
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.651040 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af93a25a-efc2-4d39-b5f0-e351484226c5-config-volume" (OuterVolumeSpecName: "config-volume") pod "af93a25a-efc2-4d39-b5f0-e351484226c5" (UID: "af93a25a-efc2-4d39-b5f0-e351484226c5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.657237 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af93a25a-efc2-4d39-b5f0-e351484226c5-kube-api-access-nhc5f" (OuterVolumeSpecName: "kube-api-access-nhc5f") pod "af93a25a-efc2-4d39-b5f0-e351484226c5" (UID: "af93a25a-efc2-4d39-b5f0-e351484226c5"). InnerVolumeSpecName "kube-api-access-nhc5f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.659564 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af93a25a-efc2-4d39-b5f0-e351484226c5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "af93a25a-efc2-4d39-b5f0-e351484226c5" (UID: "af93a25a-efc2-4d39-b5f0-e351484226c5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.752974 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhc5f\" (UniqueName: \"kubernetes.io/projected/af93a25a-efc2-4d39-b5f0-e351484226c5-kube-api-access-nhc5f\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.753055 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af93a25a-efc2-4d39-b5f0-e351484226c5-config-volume\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:02 crc kubenswrapper[4830]: I0311 09:45:02.753065 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/af93a25a-efc2-4d39-b5f0-e351484226c5-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:03 crc kubenswrapper[4830]: I0311 09:45:03.315713 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd" event={"ID":"af93a25a-efc2-4d39-b5f0-e351484226c5","Type":"ContainerDied","Data":"ff203dec6cebb922769f9a445813a6dc1e7cc1c5084356a5f06e8121de3e63f0"}
Mar 11 09:45:03 crc kubenswrapper[4830]: I0311 09:45:03.315752 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff203dec6cebb922769f9a445813a6dc1e7cc1c5084356a5f06e8121de3e63f0"
Mar 11 09:45:03 crc kubenswrapper[4830]: I0311 09:45:03.315841 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553705-fc2fd"
Mar 11 09:45:08 crc kubenswrapper[4830]: I0311 09:45:08.933395 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:45:08 crc kubenswrapper[4830]: E0311 09:45:08.934263 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.515830 4830 scope.go:117] "RemoveContainer" containerID="2a1098b6416bb8e6490be29e9e9f3e41dbec751bef400305e9eb19fba004dde2"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.547132 4830 scope.go:117] "RemoveContainer" containerID="c2bbbf9d018f0d46fe420e6e794667233782c051f9b4092216561b0d91f7fa0b"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.598899 4830 scope.go:117] "RemoveContainer" containerID="67fe025f08ec4f09f32650547a8fbea269b7b6a32235514c5e9df457df319df0"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.635936 4830 scope.go:117] "RemoveContainer" containerID="742a45d9ce3426284be6aff9e9444f6ecbccc966ce3e49259d63587c7653e445"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.689807 4830 scope.go:117] "RemoveContainer" containerID="252d8d3c729a9b19746159a6e4d6ff0f149bcd7841396e2f189f931a6e18fe2f"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.724837 4830 scope.go:117] "RemoveContainer" containerID="16eae6fccdfcb7fd43b09153a9a055ce2d03e9092b8ddf4129f1d48d361561e5"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.810291 4830 scope.go:117] "RemoveContainer" containerID="b475842f9f3725012fc6b65e111cdf6b68a11b01e30facf22311a5c6239167f7"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.840061 4830 scope.go:117] "RemoveContainer" containerID="c513882a6021e53a245d11059a47ae8bee641309fcbd1737a370fccbcef71d3d"
Mar 11 09:45:10 crc kubenswrapper[4830]: I0311 09:45:10.871435 4830 scope.go:117] "RemoveContainer" containerID="4d9607bce44ec9b425d982614df1376eea12961ef4cbaf3f9b8512ab32f5133f"
Mar 11 09:45:13 crc kubenswrapper[4830]: I0311 09:45:13.047086 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-p47hl"]
Mar 11 09:45:13 crc kubenswrapper[4830]: I0311 09:45:13.057159 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-p47hl"]
Mar 11 09:45:14 crc kubenswrapper[4830]: I0311 09:45:14.943479 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdcc3064-6041-40ab-b12e-6ca3f6bd6884" path="/var/lib/kubelet/pods/fdcc3064-6041-40ab-b12e-6ca3f6bd6884/volumes"
Mar 11 09:45:15 crc kubenswrapper[4830]: I0311 09:45:15.431963 4830 generic.go:334] "Generic (PLEG): container finished" podID="9711332d-adac-4289-81e4-686135601f68" containerID="586ef0827323ca422e05bbfaf5a2ca765e0ad19721d361f206fe2891fef2abd3" exitCode=0
Mar 11 09:45:15 crc kubenswrapper[4830]: I0311 09:45:15.432012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx" event={"ID":"9711332d-adac-4289-81e4-686135601f68","Type":"ContainerDied","Data":"586ef0827323ca422e05bbfaf5a2ca765e0ad19721d361f206fe2891fef2abd3"}
Mar 11 09:45:16 crc kubenswrapper[4830]: I0311 09:45:16.820689 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:45:16 crc kubenswrapper[4830]: I0311 09:45:16.938152 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-inventory\") pod \"9711332d-adac-4289-81e4-686135601f68\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") "
Mar 11 09:45:16 crc kubenswrapper[4830]: I0311 09:45:16.938191 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-ssh-key-openstack-edpm-ipam\") pod \"9711332d-adac-4289-81e4-686135601f68\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") "
Mar 11 09:45:16 crc kubenswrapper[4830]: I0311 09:45:16.938307 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-229r6\" (UniqueName: \"kubernetes.io/projected/9711332d-adac-4289-81e4-686135601f68-kube-api-access-229r6\") pod \"9711332d-adac-4289-81e4-686135601f68\" (UID: \"9711332d-adac-4289-81e4-686135601f68\") "
Mar 11 09:45:16 crc kubenswrapper[4830]: I0311 09:45:16.945155 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9711332d-adac-4289-81e4-686135601f68-kube-api-access-229r6" (OuterVolumeSpecName: "kube-api-access-229r6") pod "9711332d-adac-4289-81e4-686135601f68" (UID: "9711332d-adac-4289-81e4-686135601f68"). InnerVolumeSpecName "kube-api-access-229r6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:45:16 crc kubenswrapper[4830]: I0311 09:45:16.983745 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9711332d-adac-4289-81e4-686135601f68" (UID: "9711332d-adac-4289-81e4-686135601f68"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:45:16 crc kubenswrapper[4830]: I0311 09:45:16.998958 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-inventory" (OuterVolumeSpecName: "inventory") pod "9711332d-adac-4289-81e4-686135601f68" (UID: "9711332d-adac-4289-81e4-686135601f68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.040977 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-inventory\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.041038 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9711332d-adac-4289-81e4-686135601f68-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.041057 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-229r6\" (UniqueName: \"kubernetes.io/projected/9711332d-adac-4289-81e4-686135601f68-kube-api-access-229r6\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.455557 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx" event={"ID":"9711332d-adac-4289-81e4-686135601f68","Type":"ContainerDied","Data":"2726e2a00d5eef6111adc561bedf890eeafea445d0e6519dffa68f38fda6c00b"}
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.455634 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2726e2a00d5eef6111adc561bedf890eeafea445d0e6519dffa68f38fda6c00b"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.455654 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.535766 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"]
Mar 11 09:45:17 crc kubenswrapper[4830]: E0311 09:45:17.536512 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9711332d-adac-4289-81e4-686135601f68" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.536611 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9711332d-adac-4289-81e4-686135601f68" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:45:17 crc kubenswrapper[4830]: E0311 09:45:17.536712 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af93a25a-efc2-4d39-b5f0-e351484226c5" containerName="collect-profiles"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.536793 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="af93a25a-efc2-4d39-b5f0-e351484226c5" containerName="collect-profiles"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.537046 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="af93a25a-efc2-4d39-b5f0-e351484226c5" containerName="collect-profiles"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.537127 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9711332d-adac-4289-81e4-686135601f68" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.537797 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.540458 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.540747 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.540977 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.542451 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.547598 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"]
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.667415 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.667723 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.667838 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcrb5\" (UniqueName: \"kubernetes.io/projected/4d5247a6-36f4-4260-88bd-659f66f5efc0-kube-api-access-wcrb5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.771627 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.771749 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.771908 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcrb5\" (UniqueName: \"kubernetes.io/projected/4d5247a6-36f4-4260-88bd-659f66f5efc0-kube-api-access-wcrb5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.777000 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.777040 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.794815 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcrb5\" (UniqueName: \"kubernetes.io/projected/4d5247a6-36f4-4260-88bd-659f66f5efc0-kube-api-access-wcrb5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:17 crc kubenswrapper[4830]: I0311 09:45:17.860924 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.034987 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fwspb"]
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.044361 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2wcrf"]
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.055719 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fwspb"]
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.063348 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2wcrf"]
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.419245 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"]
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.433354 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.466774 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8" event={"ID":"4d5247a6-36f4-4260-88bd-659f66f5efc0","Type":"ContainerStarted","Data":"3d961dc9b997e685d586a489ccd197d7ddc7c98f44616e31625a2a25630f8ba3"}
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.945332 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253897f0-4649-46c8-9bb3-9d25a4864701" path="/var/lib/kubelet/pods/253897f0-4649-46c8-9bb3-9d25a4864701/volumes"
Mar 11 09:45:18 crc kubenswrapper[4830]: I0311 09:45:18.948536 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9713cf71-536f-4674-8184-7c7651dad952" path="/var/lib/kubelet/pods/9713cf71-536f-4674-8184-7c7651dad952/volumes"
Mar 11 09:45:19 crc kubenswrapper[4830]: I0311 09:45:19.479028 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8" event={"ID":"4d5247a6-36f4-4260-88bd-659f66f5efc0","Type":"ContainerStarted","Data":"5e58459ae4c71cb4501b377ade436c191e30dbd6117c0ca05bce21859b007302"}
Mar 11 09:45:19 crc kubenswrapper[4830]: I0311 09:45:19.501812 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8" podStartSLOduration=2.034543887 podStartE2EDuration="2.501788539s" podCreationTimestamp="2026-03-11 09:45:17 +0000 UTC" firstStartedPulling="2026-03-11 09:45:18.432919782 +0000 UTC m=+1886.214070511" lastFinishedPulling="2026-03-11 09:45:18.900164464 +0000 UTC m=+1886.681315163" observedRunningTime="2026-03-11 09:45:19.500105723 +0000 UTC m=+1887.281256442" watchObservedRunningTime="2026-03-11 09:45:19.501788539 +0000 UTC m=+1887.282939228"
Mar 11 09:45:21 crc kubenswrapper[4830]: I0311 09:45:21.932557 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f"
Mar 11 09:45:22 crc kubenswrapper[4830]: I0311 09:45:22.509426 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"4f3c625d98358eb5bb4ebc7964cb1866ae7af600501322ab72c5f9b2bdd25068"}
Mar 11 09:45:23 crc kubenswrapper[4830]: I0311 09:45:23.520572 4830 generic.go:334] "Generic (PLEG): container finished" podID="4d5247a6-36f4-4260-88bd-659f66f5efc0" containerID="5e58459ae4c71cb4501b377ade436c191e30dbd6117c0ca05bce21859b007302" exitCode=0
Mar 11 09:45:23 crc kubenswrapper[4830]: I0311 09:45:23.520650 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8" event={"ID":"4d5247a6-36f4-4260-88bd-659f66f5efc0","Type":"ContainerDied","Data":"5e58459ae4c71cb4501b377ade436c191e30dbd6117c0ca05bce21859b007302"}
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.028218 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.218296 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-inventory\") pod \"4d5247a6-36f4-4260-88bd-659f66f5efc0\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") "
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.218456 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcrb5\" (UniqueName: \"kubernetes.io/projected/4d5247a6-36f4-4260-88bd-659f66f5efc0-kube-api-access-wcrb5\") pod \"4d5247a6-36f4-4260-88bd-659f66f5efc0\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") "
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.218761 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-ssh-key-openstack-edpm-ipam\") pod \"4d5247a6-36f4-4260-88bd-659f66f5efc0\" (UID: \"4d5247a6-36f4-4260-88bd-659f66f5efc0\") "
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.226842 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5247a6-36f4-4260-88bd-659f66f5efc0-kube-api-access-wcrb5" (OuterVolumeSpecName: "kube-api-access-wcrb5") pod "4d5247a6-36f4-4260-88bd-659f66f5efc0" (UID: "4d5247a6-36f4-4260-88bd-659f66f5efc0"). InnerVolumeSpecName "kube-api-access-wcrb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.252682 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-inventory" (OuterVolumeSpecName: "inventory") pod "4d5247a6-36f4-4260-88bd-659f66f5efc0" (UID: "4d5247a6-36f4-4260-88bd-659f66f5efc0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.256166 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4d5247a6-36f4-4260-88bd-659f66f5efc0" (UID: "4d5247a6-36f4-4260-88bd-659f66f5efc0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.321816 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.321847 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d5247a6-36f4-4260-88bd-659f66f5efc0-inventory\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.321862 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcrb5\" (UniqueName: \"kubernetes.io/projected/4d5247a6-36f4-4260-88bd-659f66f5efc0-kube-api-access-wcrb5\") on node \"crc\" DevicePath \"\""
Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.544312 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8"
event={"ID":"4d5247a6-36f4-4260-88bd-659f66f5efc0","Type":"ContainerDied","Data":"3d961dc9b997e685d586a489ccd197d7ddc7c98f44616e31625a2a25630f8ba3"} Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.544347 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d961dc9b997e685d586a489ccd197d7ddc7c98f44616e31625a2a25630f8ba3" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.544353 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.624093 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4"] Mar 11 09:45:25 crc kubenswrapper[4830]: E0311 09:45:25.624790 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5247a6-36f4-4260-88bd-659f66f5efc0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.624893 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5247a6-36f4-4260-88bd-659f66f5efc0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.625337 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5247a6-36f4-4260-88bd-659f66f5efc0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.626259 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.628296 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.628546 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.630426 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.630602 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.633643 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4"] Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.729618 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.730271 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.730470 4830 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l685v\" (UniqueName: \"kubernetes.io/projected/5132dffb-d28b-494f-891d-ea13b54a5a72-kube-api-access-l685v\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.832518 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l685v\" (UniqueName: \"kubernetes.io/projected/5132dffb-d28b-494f-891d-ea13b54a5a72-kube-api-access-l685v\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.832748 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.832910 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.836931 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.836993 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.851321 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l685v\" (UniqueName: \"kubernetes.io/projected/5132dffb-d28b-494f-891d-ea13b54a5a72-kube-api-access-l685v\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jx2k4\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:25 crc kubenswrapper[4830]: I0311 09:45:25.944525 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:45:26 crc kubenswrapper[4830]: I0311 09:45:26.466098 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4"] Mar 11 09:45:26 crc kubenswrapper[4830]: W0311 09:45:26.466077 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5132dffb_d28b_494f_891d_ea13b54a5a72.slice/crio-3c87d9f3c523de65ec594329b871fbd512b9d02e15cfbbbef5ad072a25bb3e22 WatchSource:0}: Error finding container 3c87d9f3c523de65ec594329b871fbd512b9d02e15cfbbbef5ad072a25bb3e22: Status 404 returned error can't find the container with id 3c87d9f3c523de65ec594329b871fbd512b9d02e15cfbbbef5ad072a25bb3e22 Mar 11 09:45:26 crc kubenswrapper[4830]: I0311 09:45:26.554876 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" event={"ID":"5132dffb-d28b-494f-891d-ea13b54a5a72","Type":"ContainerStarted","Data":"3c87d9f3c523de65ec594329b871fbd512b9d02e15cfbbbef5ad072a25bb3e22"} Mar 11 09:45:27 crc kubenswrapper[4830]: I0311 09:45:27.569765 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" event={"ID":"5132dffb-d28b-494f-891d-ea13b54a5a72","Type":"ContainerStarted","Data":"ac8c92361e813343e6414970ac4851962ecbb1064313643bb65da160d297cdb2"} Mar 11 09:45:27 crc kubenswrapper[4830]: I0311 09:45:27.593965 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" podStartSLOduration=1.892480298 podStartE2EDuration="2.593932078s" podCreationTimestamp="2026-03-11 09:45:25 +0000 UTC" firstStartedPulling="2026-03-11 09:45:26.470192848 +0000 UTC m=+1894.251343547" lastFinishedPulling="2026-03-11 09:45:27.171644628 +0000 UTC m=+1894.952795327" 
observedRunningTime="2026-03-11 09:45:27.590244958 +0000 UTC m=+1895.371395647" watchObservedRunningTime="2026-03-11 09:45:27.593932078 +0000 UTC m=+1895.375082807" Mar 11 09:45:30 crc kubenswrapper[4830]: I0311 09:45:30.037545 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-pl7d8"] Mar 11 09:45:30 crc kubenswrapper[4830]: I0311 09:45:30.043911 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-pl7d8"] Mar 11 09:45:30 crc kubenswrapper[4830]: I0311 09:45:30.944280 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="045501ed-58bb-4a38-9b4a-5091217cf610" path="/var/lib/kubelet/pods/045501ed-58bb-4a38-9b4a-5091217cf610/volumes" Mar 11 09:45:31 crc kubenswrapper[4830]: I0311 09:45:31.048755 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lqwp9"] Mar 11 09:45:31 crc kubenswrapper[4830]: I0311 09:45:31.061283 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lqwp9"] Mar 11 09:45:32 crc kubenswrapper[4830]: I0311 09:45:32.945468 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fffcbc2c-4845-4a9d-8709-45eb4a28f0ab" path="/var/lib/kubelet/pods/fffcbc2c-4845-4a9d-8709-45eb4a28f0ab/volumes" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.150351 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553706-5cq7l"] Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.152778 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.155940 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.156612 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.156898 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.162414 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553706-5cq7l"] Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.244133 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lblz\" (UniqueName: \"kubernetes.io/projected/bb31f53a-31eb-4cfc-90c4-f3a508e746b3-kube-api-access-8lblz\") pod \"auto-csr-approver-29553706-5cq7l\" (UID: \"bb31f53a-31eb-4cfc-90c4-f3a508e746b3\") " pod="openshift-infra/auto-csr-approver-29553706-5cq7l" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.345668 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lblz\" (UniqueName: \"kubernetes.io/projected/bb31f53a-31eb-4cfc-90c4-f3a508e746b3-kube-api-access-8lblz\") pod \"auto-csr-approver-29553706-5cq7l\" (UID: \"bb31f53a-31eb-4cfc-90c4-f3a508e746b3\") " pod="openshift-infra/auto-csr-approver-29553706-5cq7l" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.405169 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lblz\" (UniqueName: \"kubernetes.io/projected/bb31f53a-31eb-4cfc-90c4-f3a508e746b3-kube-api-access-8lblz\") pod \"auto-csr-approver-29553706-5cq7l\" (UID: \"bb31f53a-31eb-4cfc-90c4-f3a508e746b3\") " 
pod="openshift-infra/auto-csr-approver-29553706-5cq7l" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.476822 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.874335 4830 generic.go:334] "Generic (PLEG): container finished" podID="5132dffb-d28b-494f-891d-ea13b54a5a72" containerID="ac8c92361e813343e6414970ac4851962ecbb1064313643bb65da160d297cdb2" exitCode=0 Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.874412 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" event={"ID":"5132dffb-d28b-494f-891d-ea13b54a5a72","Type":"ContainerDied","Data":"ac8c92361e813343e6414970ac4851962ecbb1064313643bb65da160d297cdb2"} Mar 11 09:46:00 crc kubenswrapper[4830]: I0311 09:46:00.929759 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553706-5cq7l"] Mar 11 09:46:01 crc kubenswrapper[4830]: I0311 09:46:01.887636 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" event={"ID":"bb31f53a-31eb-4cfc-90c4-f3a508e746b3","Type":"ContainerStarted","Data":"37bb38bbb0611681d88edbf3a6c993dd9aa4763da715e4a6a436222b69984421"} Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.292721 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.380878 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l685v\" (UniqueName: \"kubernetes.io/projected/5132dffb-d28b-494f-891d-ea13b54a5a72-kube-api-access-l685v\") pod \"5132dffb-d28b-494f-891d-ea13b54a5a72\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.380932 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-inventory\") pod \"5132dffb-d28b-494f-891d-ea13b54a5a72\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.380989 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-ssh-key-openstack-edpm-ipam\") pod \"5132dffb-d28b-494f-891d-ea13b54a5a72\" (UID: \"5132dffb-d28b-494f-891d-ea13b54a5a72\") " Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.390084 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5132dffb-d28b-494f-891d-ea13b54a5a72-kube-api-access-l685v" (OuterVolumeSpecName: "kube-api-access-l685v") pod "5132dffb-d28b-494f-891d-ea13b54a5a72" (UID: "5132dffb-d28b-494f-891d-ea13b54a5a72"). InnerVolumeSpecName "kube-api-access-l685v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.418194 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5132dffb-d28b-494f-891d-ea13b54a5a72" (UID: "5132dffb-d28b-494f-891d-ea13b54a5a72"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.422824 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-inventory" (OuterVolumeSpecName: "inventory") pod "5132dffb-d28b-494f-891d-ea13b54a5a72" (UID: "5132dffb-d28b-494f-891d-ea13b54a5a72"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.482712 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l685v\" (UniqueName: \"kubernetes.io/projected/5132dffb-d28b-494f-891d-ea13b54a5a72-kube-api-access-l685v\") on node \"crc\" DevicePath \"\"" Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.482742 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.482751 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5132dffb-d28b-494f-891d-ea13b54a5a72-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.896520 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" 
event={"ID":"5132dffb-d28b-494f-891d-ea13b54a5a72","Type":"ContainerDied","Data":"3c87d9f3c523de65ec594329b871fbd512b9d02e15cfbbbef5ad072a25bb3e22"} Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.896777 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c87d9f3c523de65ec594329b871fbd512b9d02e15cfbbbef5ad072a25bb3e22" Mar 11 09:46:02 crc kubenswrapper[4830]: I0311 09:46:02.896836 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jx2k4" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.017751 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2"] Mar 11 09:46:03 crc kubenswrapper[4830]: E0311 09:46:03.018294 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5132dffb-d28b-494f-891d-ea13b54a5a72" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.018320 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5132dffb-d28b-494f-891d-ea13b54a5a72" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.018584 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5132dffb-d28b-494f-891d-ea13b54a5a72" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.019249 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.022604 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.022829 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.022985 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.023180 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.026444 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2"] Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.194868 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4dbq\" (UniqueName: \"kubernetes.io/projected/aacf9f52-24a2-462c-8957-3fb5c88988d3-kube-api-access-f4dbq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.194956 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc 
kubenswrapper[4830]: I0311 09:46:03.195129 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.296271 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4dbq\" (UniqueName: \"kubernetes.io/projected/aacf9f52-24a2-462c-8957-3fb5c88988d3-kube-api-access-f4dbq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.296338 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.296422 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.301126 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.301172 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.315739 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4dbq\" (UniqueName: \"kubernetes.io/projected/aacf9f52-24a2-462c-8957-3fb5c88988d3-kube-api-access-f4dbq\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.340545 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.877338 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2"] Mar 11 09:46:03 crc kubenswrapper[4830]: W0311 09:46:03.879677 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice/crio-1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c WatchSource:0}: Error finding container 1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c: Status 404 returned error can't find the container with id 1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.908181 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" event={"ID":"aacf9f52-24a2-462c-8957-3fb5c88988d3","Type":"ContainerStarted","Data":"1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c"} Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.910426 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" event={"ID":"bb31f53a-31eb-4cfc-90c4-f3a508e746b3","Type":"ContainerStarted","Data":"b7eff5f823898959b976ca74d8e087b7766236f5627c5de44f119c5f40f9317e"} Mar 11 09:46:03 crc kubenswrapper[4830]: I0311 09:46:03.927955 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" podStartSLOduration=1.262628814 podStartE2EDuration="3.927935424s" podCreationTimestamp="2026-03-11 09:46:00 +0000 UTC" firstStartedPulling="2026-03-11 09:46:00.938402029 +0000 UTC m=+1928.719552708" lastFinishedPulling="2026-03-11 09:46:03.603708629 +0000 UTC m=+1931.384859318" observedRunningTime="2026-03-11 
09:46:03.926819013 +0000 UTC m=+1931.707969712" watchObservedRunningTime="2026-03-11 09:46:03.927935424 +0000 UTC m=+1931.709086113" Mar 11 09:46:04 crc kubenswrapper[4830]: I0311 09:46:04.919757 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" event={"ID":"aacf9f52-24a2-462c-8957-3fb5c88988d3","Type":"ContainerStarted","Data":"fbbe2f39ad075eed983a0237fffe876c545882e5b9520e5d52bf7e9c7e81512d"} Mar 11 09:46:04 crc kubenswrapper[4830]: I0311 09:46:04.921451 4830 generic.go:334] "Generic (PLEG): container finished" podID="bb31f53a-31eb-4cfc-90c4-f3a508e746b3" containerID="b7eff5f823898959b976ca74d8e087b7766236f5627c5de44f119c5f40f9317e" exitCode=0 Mar 11 09:46:04 crc kubenswrapper[4830]: I0311 09:46:04.921503 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" event={"ID":"bb31f53a-31eb-4cfc-90c4-f3a508e746b3","Type":"ContainerDied","Data":"b7eff5f823898959b976ca74d8e087b7766236f5627c5de44f119c5f40f9317e"} Mar 11 09:46:04 crc kubenswrapper[4830]: I0311 09:46:04.960286 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" podStartSLOduration=2.509040557 podStartE2EDuration="2.96026434s" podCreationTimestamp="2026-03-11 09:46:02 +0000 UTC" firstStartedPulling="2026-03-11 09:46:03.88218561 +0000 UTC m=+1931.663336299" lastFinishedPulling="2026-03-11 09:46:04.333409373 +0000 UTC m=+1932.114560082" observedRunningTime="2026-03-11 09:46:04.942792241 +0000 UTC m=+1932.723942940" watchObservedRunningTime="2026-03-11 09:46:04.96026434 +0000 UTC m=+1932.741415019" Mar 11 09:46:06 crc kubenswrapper[4830]: I0311 09:46:06.299742 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" Mar 11 09:46:06 crc kubenswrapper[4830]: I0311 09:46:06.470302 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lblz\" (UniqueName: \"kubernetes.io/projected/bb31f53a-31eb-4cfc-90c4-f3a508e746b3-kube-api-access-8lblz\") pod \"bb31f53a-31eb-4cfc-90c4-f3a508e746b3\" (UID: \"bb31f53a-31eb-4cfc-90c4-f3a508e746b3\") " Mar 11 09:46:06 crc kubenswrapper[4830]: I0311 09:46:06.480325 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb31f53a-31eb-4cfc-90c4-f3a508e746b3-kube-api-access-8lblz" (OuterVolumeSpecName: "kube-api-access-8lblz") pod "bb31f53a-31eb-4cfc-90c4-f3a508e746b3" (UID: "bb31f53a-31eb-4cfc-90c4-f3a508e746b3"). InnerVolumeSpecName "kube-api-access-8lblz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:46:06 crc kubenswrapper[4830]: I0311 09:46:06.572150 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lblz\" (UniqueName: \"kubernetes.io/projected/bb31f53a-31eb-4cfc-90c4-f3a508e746b3-kube-api-access-8lblz\") on node \"crc\" DevicePath \"\"" Mar 11 09:46:06 crc kubenswrapper[4830]: I0311 09:46:06.938831 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" Mar 11 09:46:06 crc kubenswrapper[4830]: I0311 09:46:06.942045 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553706-5cq7l" event={"ID":"bb31f53a-31eb-4cfc-90c4-f3a508e746b3","Type":"ContainerDied","Data":"37bb38bbb0611681d88edbf3a6c993dd9aa4763da715e4a6a436222b69984421"} Mar 11 09:46:06 crc kubenswrapper[4830]: I0311 09:46:06.942080 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37bb38bbb0611681d88edbf3a6c993dd9aa4763da715e4a6a436222b69984421" Mar 11 09:46:07 crc kubenswrapper[4830]: I0311 09:46:07.369096 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553700-smlzg"] Mar 11 09:46:07 crc kubenswrapper[4830]: I0311 09:46:07.377506 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553700-smlzg"] Mar 11 09:46:08 crc kubenswrapper[4830]: I0311 09:46:08.944213 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdff209b-dbf7-4005-b5c6-66091cdfebc0" path="/var/lib/kubelet/pods/bdff209b-dbf7-4005-b5c6-66091cdfebc0/volumes" Mar 11 09:46:11 crc kubenswrapper[4830]: I0311 09:46:11.100895 4830 scope.go:117] "RemoveContainer" containerID="d489ff19093d1cfd7ba882d6f3ecd629bf5e50fd1c1e140deb578d0045dd862c" Mar 11 09:46:11 crc kubenswrapper[4830]: I0311 09:46:11.165245 4830 scope.go:117] "RemoveContainer" containerID="4b8a3a83ef1b2c06935e0d9d5e696aca65f4f0b637c4f517ce8592bd55f0f7fd" Mar 11 09:46:11 crc kubenswrapper[4830]: I0311 09:46:11.232696 4830 scope.go:117] "RemoveContainer" containerID="1cf30dcfc616d5220609e5b56d96126ad66cb4802eee570f8d75ef7e0185826e" Mar 11 09:46:11 crc kubenswrapper[4830]: I0311 09:46:11.292847 4830 scope.go:117] "RemoveContainer" containerID="f5377fb376919a03714090ebd377aa81fa3c44a1c34ef7872ff669a17ba1b3f6" Mar 11 09:46:11 crc kubenswrapper[4830]: I0311 09:46:11.339595 
4830 scope.go:117] "RemoveContainer" containerID="66d7bb6bd665ede20693578acfb774acb7111749bd1bcf20640facf724763981" Mar 11 09:46:11 crc kubenswrapper[4830]: I0311 09:46:11.392057 4830 scope.go:117] "RemoveContainer" containerID="a7cdb60fefc1d286dc9762da68be43b5b76aed17670411c8fe3c05d2c4dec84d" Mar 11 09:46:14 crc kubenswrapper[4830]: I0311 09:46:14.046624 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f109-account-create-update-xvvcg"] Mar 11 09:46:14 crc kubenswrapper[4830]: I0311 09:46:14.069790 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f109-account-create-update-xvvcg"] Mar 11 09:46:14 crc kubenswrapper[4830]: I0311 09:46:14.944275 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe70376b-be05-4aba-a39a-850335299924" path="/var/lib/kubelet/pods/fe70376b-be05-4aba-a39a-850335299924/volumes" Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.040444 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gsqqs"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.057343 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2da6-account-create-update-8v4qc"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.067523 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gsqqs"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.077504 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2da6-account-create-update-8v4qc"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.087632 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-t4g4z"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.097824 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-t4g4z"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.106640 4830 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-cell1-74ec-account-create-update-qnfw8"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.116239 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4dsqs"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.123554 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-74ec-account-create-update-qnfw8"] Mar 11 09:46:15 crc kubenswrapper[4830]: I0311 09:46:15.129985 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4dsqs"] Mar 11 09:46:16 crc kubenswrapper[4830]: I0311 09:46:16.944047 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a2e8d0-730f-4d64-ad5c-87e35dda7be9" path="/var/lib/kubelet/pods/61a2e8d0-730f-4d64-ad5c-87e35dda7be9/volumes" Mar 11 09:46:16 crc kubenswrapper[4830]: I0311 09:46:16.944911 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="641dc551-b5bc-455e-9deb-20542ef0ab9b" path="/var/lib/kubelet/pods/641dc551-b5bc-455e-9deb-20542ef0ab9b/volumes" Mar 11 09:46:16 crc kubenswrapper[4830]: I0311 09:46:16.945758 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b6d682a-4504-4159-852e-36c0e757a98c" path="/var/lib/kubelet/pods/8b6d682a-4504-4159-852e-36c0e757a98c/volumes" Mar 11 09:46:16 crc kubenswrapper[4830]: I0311 09:46:16.946583 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9216b1e8-8423-4bb0-ac8e-c9c9f32e827d" path="/var/lib/kubelet/pods/9216b1e8-8423-4bb0-ac8e-c9c9f32e827d/volumes" Mar 11 09:46:16 crc kubenswrapper[4830]: I0311 09:46:16.947981 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b1c23f-429e-4d5c-85d0-a6cfc1816ae0" path="/var/lib/kubelet/pods/95b1c23f-429e-4d5c-85d0-a6cfc1816ae0/volumes" Mar 11 09:46:43 crc kubenswrapper[4830]: I0311 09:46:43.060817 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-fmx2j"] Mar 11 09:46:43 crc kubenswrapper[4830]: I0311 09:46:43.070259 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmx2j"] Mar 11 09:46:44 crc kubenswrapper[4830]: I0311 09:46:44.942986 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0cbfba-9a9b-43cd-8d56-b500764edebd" path="/var/lib/kubelet/pods/8f0cbfba-9a9b-43cd-8d56-b500764edebd/volumes" Mar 11 09:46:51 crc kubenswrapper[4830]: I0311 09:46:51.301417 4830 generic.go:334] "Generic (PLEG): container finished" podID="aacf9f52-24a2-462c-8957-3fb5c88988d3" containerID="fbbe2f39ad075eed983a0237fffe876c545882e5b9520e5d52bf7e9c7e81512d" exitCode=0 Mar 11 09:46:51 crc kubenswrapper[4830]: I0311 09:46:51.301510 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" event={"ID":"aacf9f52-24a2-462c-8957-3fb5c88988d3","Type":"ContainerDied","Data":"fbbe2f39ad075eed983a0237fffe876c545882e5b9520e5d52bf7e9c7e81512d"} Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.698193 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.826617 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-ssh-key-openstack-edpm-ipam\") pod \"aacf9f52-24a2-462c-8957-3fb5c88988d3\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.827044 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-inventory\") pod \"aacf9f52-24a2-462c-8957-3fb5c88988d3\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.827161 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4dbq\" (UniqueName: \"kubernetes.io/projected/aacf9f52-24a2-462c-8957-3fb5c88988d3-kube-api-access-f4dbq\") pod \"aacf9f52-24a2-462c-8957-3fb5c88988d3\" (UID: \"aacf9f52-24a2-462c-8957-3fb5c88988d3\") " Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.833221 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aacf9f52-24a2-462c-8957-3fb5c88988d3-kube-api-access-f4dbq" (OuterVolumeSpecName: "kube-api-access-f4dbq") pod "aacf9f52-24a2-462c-8957-3fb5c88988d3" (UID: "aacf9f52-24a2-462c-8957-3fb5c88988d3"). InnerVolumeSpecName "kube-api-access-f4dbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.855312 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aacf9f52-24a2-462c-8957-3fb5c88988d3" (UID: "aacf9f52-24a2-462c-8957-3fb5c88988d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.861305 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-inventory" (OuterVolumeSpecName: "inventory") pod "aacf9f52-24a2-462c-8957-3fb5c88988d3" (UID: "aacf9f52-24a2-462c-8957-3fb5c88988d3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.930674 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4dbq\" (UniqueName: \"kubernetes.io/projected/aacf9f52-24a2-462c-8957-3fb5c88988d3-kube-api-access-f4dbq\") on node \"crc\" DevicePath \"\"" Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.930746 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:46:52 crc kubenswrapper[4830]: I0311 09:46:52.930758 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aacf9f52-24a2-462c-8957-3fb5c88988d3-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.320491 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" 
event={"ID":"aacf9f52-24a2-462c-8957-3fb5c88988d3","Type":"ContainerDied","Data":"1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c"} Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.320538 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.320843 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.413191 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swd9t"] Mar 11 09:46:53 crc kubenswrapper[4830]: E0311 09:46:53.413884 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb31f53a-31eb-4cfc-90c4-f3a508e746b3" containerName="oc" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.413959 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb31f53a-31eb-4cfc-90c4-f3a508e746b3" containerName="oc" Mar 11 09:46:53 crc kubenswrapper[4830]: E0311 09:46:53.414098 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aacf9f52-24a2-462c-8957-3fb5c88988d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.414161 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aacf9f52-24a2-462c-8957-3fb5c88988d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.414410 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="aacf9f52-24a2-462c-8957-3fb5c88988d3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.414499 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb31f53a-31eb-4cfc-90c4-f3a508e746b3" 
containerName="oc" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.415272 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.417517 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.417824 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.417935 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.418085 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.438478 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swd9t"] Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.540069 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.540204 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc 
kubenswrapper[4830]: I0311 09:46:53.540307 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7sfq\" (UniqueName: \"kubernetes.io/projected/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-kube-api-access-z7sfq\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.642484 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.643154 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.643324 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7sfq\" (UniqueName: \"kubernetes.io/projected/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-kube-api-access-z7sfq\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.649041 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: 
\"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.652333 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.662645 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7sfq\" (UniqueName: \"kubernetes.io/projected/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-kube-api-access-z7sfq\") pod \"ssh-known-hosts-edpm-deployment-swd9t\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:53 crc kubenswrapper[4830]: I0311 09:46:53.736494 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:46:54 crc kubenswrapper[4830]: I0311 09:46:54.236096 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-swd9t"] Mar 11 09:46:54 crc kubenswrapper[4830]: W0311 09:46:54.238507 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21e999cb_ceca_44f0_a7e8_cf0d801e84a7.slice/crio-73834d1d1b258385798acad9db177a79550619794aa90c2d124f16d964fd4210 WatchSource:0}: Error finding container 73834d1d1b258385798acad9db177a79550619794aa90c2d124f16d964fd4210: Status 404 returned error can't find the container with id 73834d1d1b258385798acad9db177a79550619794aa90c2d124f16d964fd4210 Mar 11 09:46:54 crc kubenswrapper[4830]: I0311 09:46:54.332012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" event={"ID":"21e999cb-ceca-44f0-a7e8-cf0d801e84a7","Type":"ContainerStarted","Data":"73834d1d1b258385798acad9db177a79550619794aa90c2d124f16d964fd4210"} Mar 11 09:46:54 crc kubenswrapper[4830]: E0311 09:46:54.785519 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice/crio-1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice\": RecentStats: unable to find data in memory cache]" Mar 11 09:46:56 crc kubenswrapper[4830]: I0311 09:46:56.348686 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" 
event={"ID":"21e999cb-ceca-44f0-a7e8-cf0d801e84a7","Type":"ContainerStarted","Data":"1a4900582a0c15d35f63b769979d6fdce45a77d4e0df03f18845371651614b0b"} Mar 11 09:47:02 crc kubenswrapper[4830]: I0311 09:47:02.946888 4830 generic.go:334] "Generic (PLEG): container finished" podID="21e999cb-ceca-44f0-a7e8-cf0d801e84a7" containerID="1a4900582a0c15d35f63b769979d6fdce45a77d4e0df03f18845371651614b0b" exitCode=0 Mar 11 09:47:02 crc kubenswrapper[4830]: I0311 09:47:02.946945 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" event={"ID":"21e999cb-ceca-44f0-a7e8-cf0d801e84a7","Type":"ContainerDied","Data":"1a4900582a0c15d35f63b769979d6fdce45a77d4e0df03f18845371651614b0b"} Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.351169 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.450572 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-inventory-0\") pod \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.450686 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-ssh-key-openstack-edpm-ipam\") pod \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\" (UID: \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.450706 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7sfq\" (UniqueName: \"kubernetes.io/projected/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-kube-api-access-z7sfq\") pod \"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\" (UID: 
\"21e999cb-ceca-44f0-a7e8-cf0d801e84a7\") " Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.455841 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-kube-api-access-z7sfq" (OuterVolumeSpecName: "kube-api-access-z7sfq") pod "21e999cb-ceca-44f0-a7e8-cf0d801e84a7" (UID: "21e999cb-ceca-44f0-a7e8-cf0d801e84a7"). InnerVolumeSpecName "kube-api-access-z7sfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.476792 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "21e999cb-ceca-44f0-a7e8-cf0d801e84a7" (UID: "21e999cb-ceca-44f0-a7e8-cf0d801e84a7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.478338 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "21e999cb-ceca-44f0-a7e8-cf0d801e84a7" (UID: "21e999cb-ceca-44f0-a7e8-cf0d801e84a7"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.552854 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7sfq\" (UniqueName: \"kubernetes.io/projected/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-kube-api-access-z7sfq\") on node \"crc\" DevicePath \"\"" Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.552888 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.552898 4830 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/21e999cb-ceca-44f0-a7e8-cf0d801e84a7-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.962440 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" event={"ID":"21e999cb-ceca-44f0-a7e8-cf0d801e84a7","Type":"ContainerDied","Data":"73834d1d1b258385798acad9db177a79550619794aa90c2d124f16d964fd4210"} Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.962624 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73834d1d1b258385798acad9db177a79550619794aa90c2d124f16d964fd4210" Mar 11 09:47:04 crc kubenswrapper[4830]: I0311 09:47:04.962462 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-swd9t" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.050479 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"] Mar 11 09:47:05 crc kubenswrapper[4830]: E0311 09:47:05.051554 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e999cb-ceca-44f0-a7e8-cf0d801e84a7" containerName="ssh-known-hosts-edpm-deployment" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.051643 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e999cb-ceca-44f0-a7e8-cf0d801e84a7" containerName="ssh-known-hosts-edpm-deployment" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.051932 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e999cb-ceca-44f0-a7e8-cf0d801e84a7" containerName="ssh-known-hosts-edpm-deployment" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.052671 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.054900 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.055132 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.061405 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.061491 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:47:05 crc kubenswrapper[4830]: E0311 09:47:05.062131 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice/crio-1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c\": RecentStats: unable to find data in memory cache]" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.075579 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"] Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.166093 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.166175 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mf8k\" (UniqueName: \"kubernetes.io/projected/9ae7bc18-6614-4094-961f-9590aa0346f4-kube-api-access-9mf8k\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.166642 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.268524 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.268641 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mf8k\" (UniqueName: \"kubernetes.io/projected/9ae7bc18-6614-4094-961f-9590aa0346f4-kube-api-access-9mf8k\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.268749 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"
Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.273202 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"
Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.273827 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"
Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.288682 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mf8k\" (UniqueName: \"kubernetes.io/projected/9ae7bc18-6614-4094-961f-9590aa0346f4-kube-api-access-9mf8k\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xf8b7\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"
Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.381963 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"
Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.917164 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"]
Mar 11 09:47:05 crc kubenswrapper[4830]: I0311 09:47:05.985178 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" event={"ID":"9ae7bc18-6614-4094-961f-9590aa0346f4","Type":"ContainerStarted","Data":"02938a91aabc6527d5430c12d521d88ea6db564d56cbbd01d1c3ebc5a7b62840"}
Mar 11 09:47:06 crc kubenswrapper[4830]: I0311 09:47:06.038793 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-67js2"]
Mar 11 09:47:06 crc kubenswrapper[4830]: I0311 09:47:06.046650 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-67js2"]
Mar 11 09:47:06 crc kubenswrapper[4830]: I0311 09:47:06.944677 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a498e09-d7b4-4af2-bb1a-dff67b8ce005" path="/var/lib/kubelet/pods/7a498e09-d7b4-4af2-bb1a-dff67b8ce005/volumes"
Mar 11 09:47:06 crc kubenswrapper[4830]: I0311 09:47:06.995230 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" event={"ID":"9ae7bc18-6614-4094-961f-9590aa0346f4","Type":"ContainerStarted","Data":"390abc40e26bf10132042ebc310b9bb28a9504c5f8243ab6656fa41ad1c63640"}
Mar 11 09:47:07 crc kubenswrapper[4830]: I0311 09:47:07.016697 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" podStartSLOduration=1.466748215 podStartE2EDuration="2.016673213s" podCreationTimestamp="2026-03-11 09:47:05 +0000 UTC" firstStartedPulling="2026-03-11 09:47:05.924842537 +0000 UTC m=+1993.705993226" lastFinishedPulling="2026-03-11 09:47:06.474767525 +0000 UTC m=+1994.255918224" observedRunningTime="2026-03-11 09:47:07.014888515 +0000 UTC m=+1994.796039204" watchObservedRunningTime="2026-03-11 09:47:07.016673213 +0000 UTC m=+1994.797823932"
Mar 11 09:47:08 crc kubenswrapper[4830]: I0311 09:47:08.040325 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d9ntp"]
Mar 11 09:47:08 crc kubenswrapper[4830]: I0311 09:47:08.050469 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d9ntp"]
Mar 11 09:47:08 crc kubenswrapper[4830]: I0311 09:47:08.943141 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5849038f-38d8-48c8-a4d8-70dc0166cdf9" path="/var/lib/kubelet/pods/5849038f-38d8-48c8-a4d8-70dc0166cdf9/volumes"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.589335 4830 scope.go:117] "RemoveContainer" containerID="de47a77de53b1045546f55aa83ee6761d8f76ca7da73291f8e773734cb17a46b"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.625189 4830 scope.go:117] "RemoveContainer" containerID="e7bc08a3b17cbc5f85b1475599e86b823bd791b4e505f87a6808f116e94b420e"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.673472 4830 scope.go:117] "RemoveContainer" containerID="cf56e493f1cb721439ce09910e2edefc4bca70367610b1eaef3ea49bbbd618de"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.717419 4830 scope.go:117] "RemoveContainer" containerID="c24780ade9f15d78f78d1861d632e31d983b213a8e04fc73edb39ca91e6b403a"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.787954 4830 scope.go:117] "RemoveContainer" containerID="5473ea6d9d3da892e42d77657dcb21aa2384b3204229d921b89cd169b0193028"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.840458 4830 scope.go:117] "RemoveContainer" containerID="0d8fdb376611f5d58a9469459b7dc7d5122fc75221799e93480fd1daba7014b2"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.864001 4830 scope.go:117] "RemoveContainer" containerID="8ca726f13d882498313f36c1d035d7b93adecd23cc546d081e01a0b18beb2ecd"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.885061 4830 scope.go:117] "RemoveContainer" containerID="db59682059ad5d3852437c009d1ec91a3b7d8c392986953cfd586a2f8968ab27"
Mar 11 09:47:11 crc kubenswrapper[4830]: I0311 09:47:11.918870 4830 scope.go:117] "RemoveContainer" containerID="cd57baeb947c14b301bd8ef404f1f924755b9953c55251d099a0a982e7dc7463"
Mar 11 09:47:14 crc kubenswrapper[4830]: I0311 09:47:14.054142 4830 generic.go:334] "Generic (PLEG): container finished" podID="9ae7bc18-6614-4094-961f-9590aa0346f4" containerID="390abc40e26bf10132042ebc310b9bb28a9504c5f8243ab6656fa41ad1c63640" exitCode=0
Mar 11 09:47:14 crc kubenswrapper[4830]: I0311 09:47:14.054195 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" event={"ID":"9ae7bc18-6614-4094-961f-9590aa0346f4","Type":"ContainerDied","Data":"390abc40e26bf10132042ebc310b9bb28a9504c5f8243ab6656fa41ad1c63640"}
Mar 11 09:47:15 crc kubenswrapper[4830]: E0311 09:47:15.328852 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice/crio-1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice\": RecentStats: unable to find data in memory cache]"
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.466312 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.663847 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-ssh-key-openstack-edpm-ipam\") pod \"9ae7bc18-6614-4094-961f-9590aa0346f4\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") "
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.664040 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mf8k\" (UniqueName: \"kubernetes.io/projected/9ae7bc18-6614-4094-961f-9590aa0346f4-kube-api-access-9mf8k\") pod \"9ae7bc18-6614-4094-961f-9590aa0346f4\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") "
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.664122 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-inventory\") pod \"9ae7bc18-6614-4094-961f-9590aa0346f4\" (UID: \"9ae7bc18-6614-4094-961f-9590aa0346f4\") "
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.670253 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae7bc18-6614-4094-961f-9590aa0346f4-kube-api-access-9mf8k" (OuterVolumeSpecName: "kube-api-access-9mf8k") pod "9ae7bc18-6614-4094-961f-9590aa0346f4" (UID: "9ae7bc18-6614-4094-961f-9590aa0346f4"). InnerVolumeSpecName "kube-api-access-9mf8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.739951 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ae7bc18-6614-4094-961f-9590aa0346f4" (UID: "9ae7bc18-6614-4094-961f-9590aa0346f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.741586 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-inventory" (OuterVolumeSpecName: "inventory") pod "9ae7bc18-6614-4094-961f-9590aa0346f4" (UID: "9ae7bc18-6614-4094-961f-9590aa0346f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.766722 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.766759 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mf8k\" (UniqueName: \"kubernetes.io/projected/9ae7bc18-6614-4094-961f-9590aa0346f4-kube-api-access-9mf8k\") on node \"crc\" DevicePath \"\""
Mar 11 09:47:15 crc kubenswrapper[4830]: I0311 09:47:15.766777 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ae7bc18-6614-4094-961f-9590aa0346f4-inventory\") on node \"crc\" DevicePath \"\""
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.074698 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7" event={"ID":"9ae7bc18-6614-4094-961f-9590aa0346f4","Type":"ContainerDied","Data":"02938a91aabc6527d5430c12d521d88ea6db564d56cbbd01d1c3ebc5a7b62840"}
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.074964 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02938a91aabc6527d5430c12d521d88ea6db564d56cbbd01d1c3ebc5a7b62840"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.074787 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xf8b7"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.133463 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"]
Mar 11 09:47:16 crc kubenswrapper[4830]: E0311 09:47:16.133832 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae7bc18-6614-4094-961f-9590aa0346f4" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.133856 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae7bc18-6614-4094-961f-9590aa0346f4" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.134074 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae7bc18-6614-4094-961f-9590aa0346f4" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.134789 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.138149 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.138376 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.138723 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.138729 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.145982 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"]
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.276142 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pk5b\" (UniqueName: \"kubernetes.io/projected/592c8d08-ac0e-4665-9d65-e362412b7867-kube-api-access-4pk5b\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.276670 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.276729 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.378274 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pk5b\" (UniqueName: \"kubernetes.io/projected/592c8d08-ac0e-4665-9d65-e362412b7867-kube-api-access-4pk5b\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.378412 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.378437 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.391190 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.393173 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.398613 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pk5b\" (UniqueName: \"kubernetes.io/projected/592c8d08-ac0e-4665-9d65-e362412b7867-kube-api-access-4pk5b\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:16 crc kubenswrapper[4830]: I0311 09:47:16.454503 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:17 crc kubenswrapper[4830]: I0311 09:47:17.031926 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"]
Mar 11 09:47:17 crc kubenswrapper[4830]: I0311 09:47:17.084143 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56" event={"ID":"592c8d08-ac0e-4665-9d65-e362412b7867","Type":"ContainerStarted","Data":"4b44734e42e9768c2930322ab682163a014d0f648bab4789874a1c0859ced6d3"}
Mar 11 09:47:19 crc kubenswrapper[4830]: I0311 09:47:19.106063 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56" event={"ID":"592c8d08-ac0e-4665-9d65-e362412b7867","Type":"ContainerStarted","Data":"c03b651d842d0e3ba6584a4dc44384eef0a7d556d253949abb9ae9ecdd3edf17"}
Mar 11 09:47:19 crc kubenswrapper[4830]: I0311 09:47:19.135611 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56" podStartSLOduration=2.31702646 podStartE2EDuration="3.135565199s" podCreationTimestamp="2026-03-11 09:47:16 +0000 UTC" firstStartedPulling="2026-03-11 09:47:17.04841665 +0000 UTC m=+2004.829567379" lastFinishedPulling="2026-03-11 09:47:17.866955419 +0000 UTC m=+2005.648106118" observedRunningTime="2026-03-11 09:47:19.124430774 +0000 UTC m=+2006.905581473" watchObservedRunningTime="2026-03-11 09:47:19.135565199 +0000 UTC m=+2006.916715888"
Mar 11 09:47:25 crc kubenswrapper[4830]: E0311 09:47:25.553346 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice/crio-1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c\": RecentStats: unable to find data in memory cache]"
Mar 11 09:47:27 crc kubenswrapper[4830]: I0311 09:47:27.191356 4830 generic.go:334] "Generic (PLEG): container finished" podID="592c8d08-ac0e-4665-9d65-e362412b7867" containerID="c03b651d842d0e3ba6584a4dc44384eef0a7d556d253949abb9ae9ecdd3edf17" exitCode=0
Mar 11 09:47:27 crc kubenswrapper[4830]: I0311 09:47:27.191406 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56" event={"ID":"592c8d08-ac0e-4665-9d65-e362412b7867","Type":"ContainerDied","Data":"c03b651d842d0e3ba6584a4dc44384eef0a7d556d253949abb9ae9ecdd3edf17"}
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.560685 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.634974 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-ssh-key-openstack-edpm-ipam\") pod \"592c8d08-ac0e-4665-9d65-e362412b7867\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") "
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.635350 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-inventory\") pod \"592c8d08-ac0e-4665-9d65-e362412b7867\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") "
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.635398 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pk5b\" (UniqueName: \"kubernetes.io/projected/592c8d08-ac0e-4665-9d65-e362412b7867-kube-api-access-4pk5b\") pod \"592c8d08-ac0e-4665-9d65-e362412b7867\" (UID: \"592c8d08-ac0e-4665-9d65-e362412b7867\") "
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.644598 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/592c8d08-ac0e-4665-9d65-e362412b7867-kube-api-access-4pk5b" (OuterVolumeSpecName: "kube-api-access-4pk5b") pod "592c8d08-ac0e-4665-9d65-e362412b7867" (UID: "592c8d08-ac0e-4665-9d65-e362412b7867"). InnerVolumeSpecName "kube-api-access-4pk5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.662743 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "592c8d08-ac0e-4665-9d65-e362412b7867" (UID: "592c8d08-ac0e-4665-9d65-e362412b7867"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.669325 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-inventory" (OuterVolumeSpecName: "inventory") pod "592c8d08-ac0e-4665-9d65-e362412b7867" (UID: "592c8d08-ac0e-4665-9d65-e362412b7867"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.737843 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.737883 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/592c8d08-ac0e-4665-9d65-e362412b7867-inventory\") on node \"crc\" DevicePath \"\""
Mar 11 09:47:28 crc kubenswrapper[4830]: I0311 09:47:28.737893 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pk5b\" (UniqueName: \"kubernetes.io/projected/592c8d08-ac0e-4665-9d65-e362412b7867-kube-api-access-4pk5b\") on node \"crc\" DevicePath \"\""
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.230722 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56" event={"ID":"592c8d08-ac0e-4665-9d65-e362412b7867","Type":"ContainerDied","Data":"4b44734e42e9768c2930322ab682163a014d0f648bab4789874a1c0859ced6d3"}
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.230785 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b44734e42e9768c2930322ab682163a014d0f648bab4789874a1c0859ced6d3"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.230836 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.293012 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"]
Mar 11 09:47:29 crc kubenswrapper[4830]: E0311 09:47:29.293484 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="592c8d08-ac0e-4665-9d65-e362412b7867" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.293501 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="592c8d08-ac0e-4665-9d65-e362412b7867" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.293665 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="592c8d08-ac0e-4665-9d65-e362412b7867" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.294354 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.296431 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.296783 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.296867 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.296928 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.296946 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.296934 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.297863 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.299393 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.314153 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"]
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.346768 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.346993 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347102 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347343 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347428 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347462 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347581 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347629 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347659 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347699 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347749 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hqqv\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-kube-api-access-8hqqv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347774 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347794 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.347825 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449626 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449690 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hqqv\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-kube-api-access-8hqqv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449711 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449733 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449769 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449800 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449861 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449881 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449905 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449926 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449943 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.449983 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.450007 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.450064 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"
Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.456077 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.456075 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.456798 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.456823 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.457290 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.457572 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.460618 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.461109 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.461753 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.462139 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.462543 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.462713 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.465097 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.468363 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hqqv\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-kube-api-access-8hqqv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-672vg\" (UID: 
\"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.612677 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:47:29 crc kubenswrapper[4830]: I0311 09:47:29.927486 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg"] Mar 11 09:47:29 crc kubenswrapper[4830]: W0311 09:47:29.930093 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13dfb6c4_9546_4a13_bc42_842a71c96c6c.slice/crio-80e921fecb2b89f2e3b60d27663e1bf62436d266328c3c0c2d62ab2b29d5b74a WatchSource:0}: Error finding container 80e921fecb2b89f2e3b60d27663e1bf62436d266328c3c0c2d62ab2b29d5b74a: Status 404 returned error can't find the container with id 80e921fecb2b89f2e3b60d27663e1bf62436d266328c3c0c2d62ab2b29d5b74a Mar 11 09:47:30 crc kubenswrapper[4830]: I0311 09:47:30.241563 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" event={"ID":"13dfb6c4-9546-4a13-bc42-842a71c96c6c","Type":"ContainerStarted","Data":"80e921fecb2b89f2e3b60d27663e1bf62436d266328c3c0c2d62ab2b29d5b74a"} Mar 11 09:47:31 crc kubenswrapper[4830]: I0311 09:47:31.251897 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" event={"ID":"13dfb6c4-9546-4a13-bc42-842a71c96c6c","Type":"ContainerStarted","Data":"10d90aff36e840e22a25f71e7b403ed033dd7587c0effb18d69383b12f6443ad"} Mar 11 09:47:31 crc kubenswrapper[4830]: I0311 09:47:31.274168 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" podStartSLOduration=1.649261986 
podStartE2EDuration="2.274153485s" podCreationTimestamp="2026-03-11 09:47:29 +0000 UTC" firstStartedPulling="2026-03-11 09:47:29.933760886 +0000 UTC m=+2017.714911575" lastFinishedPulling="2026-03-11 09:47:30.558652385 +0000 UTC m=+2018.339803074" observedRunningTime="2026-03-11 09:47:31.27399514 +0000 UTC m=+2019.055145829" watchObservedRunningTime="2026-03-11 09:47:31.274153485 +0000 UTC m=+2019.055304174" Mar 11 09:47:35 crc kubenswrapper[4830]: E0311 09:47:35.771547 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice/crio-1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice\": RecentStats: unable to find data in memory cache]" Mar 11 09:47:43 crc kubenswrapper[4830]: I0311 09:47:43.060794 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:47:43 crc kubenswrapper[4830]: I0311 09:47:43.062180 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:47:46 crc kubenswrapper[4830]: E0311 09:47:46.054377 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaacf9f52_24a2_462c_8957_3fb5c88988d3.slice/crio-1ee38e578821fa6b589da2f8d7ff1b83b3a48ed07c3492c62bc36eff10fb0f0c\": RecentStats: unable to find data in memory cache]" Mar 11 09:47:52 crc kubenswrapper[4830]: I0311 09:47:52.046741 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7p4r5"] Mar 11 09:47:52 crc kubenswrapper[4830]: I0311 09:47:52.054617 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7p4r5"] Mar 11 09:47:52 crc kubenswrapper[4830]: I0311 09:47:52.942819 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76673d7a-d07c-4dd4-804b-c18820921185" path="/var/lib/kubelet/pods/76673d7a-d07c-4dd4-804b-c18820921185/volumes" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.150385 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553708-ctxh8"] Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.152245 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553708-ctxh8" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.156362 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.156452 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.156520 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.164140 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553708-ctxh8"] Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.324314 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssqb\" (UniqueName: \"kubernetes.io/projected/b64cb553-4d78-4d92-be2d-191073aaa5e5-kube-api-access-kssqb\") pod \"auto-csr-approver-29553708-ctxh8\" (UID: \"b64cb553-4d78-4d92-be2d-191073aaa5e5\") " pod="openshift-infra/auto-csr-approver-29553708-ctxh8" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.426788 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kssqb\" (UniqueName: \"kubernetes.io/projected/b64cb553-4d78-4d92-be2d-191073aaa5e5-kube-api-access-kssqb\") pod \"auto-csr-approver-29553708-ctxh8\" (UID: \"b64cb553-4d78-4d92-be2d-191073aaa5e5\") " pod="openshift-infra/auto-csr-approver-29553708-ctxh8" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.447014 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssqb\" (UniqueName: \"kubernetes.io/projected/b64cb553-4d78-4d92-be2d-191073aaa5e5-kube-api-access-kssqb\") pod \"auto-csr-approver-29553708-ctxh8\" (UID: \"b64cb553-4d78-4d92-be2d-191073aaa5e5\") " 
pod="openshift-infra/auto-csr-approver-29553708-ctxh8" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.476573 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553708-ctxh8" Mar 11 09:48:00 crc kubenswrapper[4830]: I0311 09:48:00.952088 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553708-ctxh8"] Mar 11 09:48:01 crc kubenswrapper[4830]: I0311 09:48:01.547675 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553708-ctxh8" event={"ID":"b64cb553-4d78-4d92-be2d-191073aaa5e5","Type":"ContainerStarted","Data":"84430eaf5c7ca6f172fa7b199044bc7c51a12b4bc88320aa76c3faaca19866ab"} Mar 11 09:48:02 crc kubenswrapper[4830]: I0311 09:48:02.559339 4830 generic.go:334] "Generic (PLEG): container finished" podID="b64cb553-4d78-4d92-be2d-191073aaa5e5" containerID="90e65b1c2d89a74fcafd8f1e9429f4aa37f35143da58e57bcbd7008ef15b61f0" exitCode=0 Mar 11 09:48:02 crc kubenswrapper[4830]: I0311 09:48:02.559424 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553708-ctxh8" event={"ID":"b64cb553-4d78-4d92-be2d-191073aaa5e5","Type":"ContainerDied","Data":"90e65b1c2d89a74fcafd8f1e9429f4aa37f35143da58e57bcbd7008ef15b61f0"} Mar 11 09:48:03 crc kubenswrapper[4830]: I0311 09:48:03.892540 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553708-ctxh8" Mar 11 09:48:04 crc kubenswrapper[4830]: I0311 09:48:03.999844 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kssqb\" (UniqueName: \"kubernetes.io/projected/b64cb553-4d78-4d92-be2d-191073aaa5e5-kube-api-access-kssqb\") pod \"b64cb553-4d78-4d92-be2d-191073aaa5e5\" (UID: \"b64cb553-4d78-4d92-be2d-191073aaa5e5\") " Mar 11 09:48:04 crc kubenswrapper[4830]: I0311 09:48:04.005103 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b64cb553-4d78-4d92-be2d-191073aaa5e5-kube-api-access-kssqb" (OuterVolumeSpecName: "kube-api-access-kssqb") pod "b64cb553-4d78-4d92-be2d-191073aaa5e5" (UID: "b64cb553-4d78-4d92-be2d-191073aaa5e5"). InnerVolumeSpecName "kube-api-access-kssqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:48:04 crc kubenswrapper[4830]: I0311 09:48:04.102790 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kssqb\" (UniqueName: \"kubernetes.io/projected/b64cb553-4d78-4d92-be2d-191073aaa5e5-kube-api-access-kssqb\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:04 crc kubenswrapper[4830]: I0311 09:48:04.578096 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553708-ctxh8" event={"ID":"b64cb553-4d78-4d92-be2d-191073aaa5e5","Type":"ContainerDied","Data":"84430eaf5c7ca6f172fa7b199044bc7c51a12b4bc88320aa76c3faaca19866ab"} Mar 11 09:48:04 crc kubenswrapper[4830]: I0311 09:48:04.578145 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84430eaf5c7ca6f172fa7b199044bc7c51a12b4bc88320aa76c3faaca19866ab" Mar 11 09:48:04 crc kubenswrapper[4830]: I0311 09:48:04.578146 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553708-ctxh8" Mar 11 09:48:04 crc kubenswrapper[4830]: I0311 09:48:04.948720 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553702-qk2gt"] Mar 11 09:48:04 crc kubenswrapper[4830]: I0311 09:48:04.956667 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553702-qk2gt"] Mar 11 09:48:05 crc kubenswrapper[4830]: I0311 09:48:05.596501 4830 generic.go:334] "Generic (PLEG): container finished" podID="13dfb6c4-9546-4a13-bc42-842a71c96c6c" containerID="10d90aff36e840e22a25f71e7b403ed033dd7587c0effb18d69383b12f6443ad" exitCode=0 Mar 11 09:48:05 crc kubenswrapper[4830]: I0311 09:48:05.596832 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" event={"ID":"13dfb6c4-9546-4a13-bc42-842a71c96c6c","Type":"ContainerDied","Data":"10d90aff36e840e22a25f71e7b403ed033dd7587c0effb18d69383b12f6443ad"} Mar 11 09:48:06 crc kubenswrapper[4830]: I0311 09:48:06.948435 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92bb55df-aad1-44a2-88f4-e61040246ef8" path="/var/lib/kubelet/pods/92bb55df-aad1-44a2-88f4-e61040246ef8/volumes" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.025991 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159090 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159476 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159530 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ovn-combined-ca-bundle\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159575 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-nova-combined-ca-bundle\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159645 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-telemetry-combined-ca-bundle\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: 
\"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159676 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-neutron-metadata-combined-ca-bundle\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159702 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159750 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-libvirt-combined-ca-bundle\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159781 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159851 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-inventory\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 
11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-repo-setup-combined-ca-bundle\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159903 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-bootstrap-combined-ca-bundle\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159939 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ssh-key-openstack-edpm-ipam\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.159973 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hqqv\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-kube-api-access-8hqqv\") pod \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\" (UID: \"13dfb6c4-9546-4a13-bc42-842a71c96c6c\") " Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.165920 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-kube-api-access-8hqqv" (OuterVolumeSpecName: "kube-api-access-8hqqv") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "kube-api-access-8hqqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.167621 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.167634 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.167959 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.169724 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.169934 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.170091 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.170229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.171459 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.171860 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.171881 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.172457 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.195639 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.200149 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-inventory" (OuterVolumeSpecName: "inventory") pod "13dfb6c4-9546-4a13-bc42-842a71c96c6c" (UID: "13dfb6c4-9546-4a13-bc42-842a71c96c6c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262342 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262385 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262400 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262415 4830 reconciler_common.go:293] "Volume detached for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262427 4830 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262439 4830 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262455 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262469 4830 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262481 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262495 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc 
kubenswrapper[4830]: I0311 09:48:07.262506 4830 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262517 4830 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262531 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13dfb6c4-9546-4a13-bc42-842a71c96c6c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.262544 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hqqv\" (UniqueName: \"kubernetes.io/projected/13dfb6c4-9546-4a13-bc42-842a71c96c6c-kube-api-access-8hqqv\") on node \"crc\" DevicePath \"\"" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.614063 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" event={"ID":"13dfb6c4-9546-4a13-bc42-842a71c96c6c","Type":"ContainerDied","Data":"80e921fecb2b89f2e3b60d27663e1bf62436d266328c3c0c2d62ab2b29d5b74a"} Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.614106 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e921fecb2b89f2e3b60d27663e1bf62436d266328c3c0c2d62ab2b29d5b74a" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.614136 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-672vg" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.716523 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x"] Mar 11 09:48:07 crc kubenswrapper[4830]: E0311 09:48:07.716972 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dfb6c4-9546-4a13-bc42-842a71c96c6c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.717000 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dfb6c4-9546-4a13-bc42-842a71c96c6c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 11 09:48:07 crc kubenswrapper[4830]: E0311 09:48:07.717051 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b64cb553-4d78-4d92-be2d-191073aaa5e5" containerName="oc" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.717060 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b64cb553-4d78-4d92-be2d-191073aaa5e5" containerName="oc" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.717262 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b64cb553-4d78-4d92-be2d-191073aaa5e5" containerName="oc" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.717278 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dfb6c4-9546-4a13-bc42-842a71c96c6c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.718097 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.720620 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.720688 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.720640 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.721993 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.722355 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.726141 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x"] Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.873819 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.873878 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: 
\"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.873949 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.874213 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg2tz\" (UniqueName: \"kubernetes.io/projected/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-kube-api-access-mg2tz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.874270 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.976203 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.976369 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.976440 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg2tz\" (UniqueName: \"kubernetes.io/projected/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-kube-api-access-mg2tz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.976532 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.976578 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.977251 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: 
\"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.981158 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.981641 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.981824 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:07 crc kubenswrapper[4830]: I0311 09:48:07.998663 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg2tz\" (UniqueName: \"kubernetes.io/projected/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-kube-api-access-mg2tz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-kqw9x\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:08 crc kubenswrapper[4830]: I0311 09:48:08.034179 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:48:08 crc kubenswrapper[4830]: I0311 09:48:08.534658 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x"] Mar 11 09:48:08 crc kubenswrapper[4830]: I0311 09:48:08.625121 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" event={"ID":"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3","Type":"ContainerStarted","Data":"4535ffd7725d52237c4ca25ff543bf80f6969f21431e5f460c6c11ec28deae22"} Mar 11 09:48:09 crc kubenswrapper[4830]: I0311 09:48:09.635847 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" event={"ID":"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3","Type":"ContainerStarted","Data":"764fb3c99292c8856e3747cb48e5cf92fba13c8237770b3ec718beaa28e3d58e"} Mar 11 09:48:09 crc kubenswrapper[4830]: I0311 09:48:09.657863 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" podStartSLOduration=1.934926664 podStartE2EDuration="2.657837064s" podCreationTimestamp="2026-03-11 09:48:07 +0000 UTC" firstStartedPulling="2026-03-11 09:48:08.537082995 +0000 UTC m=+2056.318233684" lastFinishedPulling="2026-03-11 09:48:09.259993395 +0000 UTC m=+2057.041144084" observedRunningTime="2026-03-11 09:48:09.654129704 +0000 UTC m=+2057.435280413" watchObservedRunningTime="2026-03-11 09:48:09.657837064 +0000 UTC m=+2057.438987753" Mar 11 09:48:12 crc kubenswrapper[4830]: I0311 09:48:12.076737 4830 scope.go:117] "RemoveContainer" containerID="61ece24803ef2b400cdff4a524abdc00b253867f21f2545cb593bcdda494db78" Mar 11 09:48:12 crc kubenswrapper[4830]: I0311 09:48:12.117964 4830 scope.go:117] "RemoveContainer" containerID="f9eedf5d32bbdc2d41dcdd433c5ff6b20b9a5419f104c5b6dc02d8c361f10797" Mar 11 09:48:13 crc kubenswrapper[4830]: I0311 09:48:13.060734 
4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:48:13 crc kubenswrapper[4830]: I0311 09:48:13.061601 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.060092 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.060611 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.060652 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.061378 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f3c625d98358eb5bb4ebc7964cb1866ae7af600501322ab72c5f9b2bdd25068"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.061426 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://4f3c625d98358eb5bb4ebc7964cb1866ae7af600501322ab72c5f9b2bdd25068" gracePeriod=600 Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.922229 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="4f3c625d98358eb5bb4ebc7964cb1866ae7af600501322ab72c5f9b2bdd25068" exitCode=0 Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.922321 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"4f3c625d98358eb5bb4ebc7964cb1866ae7af600501322ab72c5f9b2bdd25068"} Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.922743 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8"} Mar 11 09:48:43 crc kubenswrapper[4830]: I0311 09:48:43.922768 4830 scope.go:117] "RemoveContainer" containerID="bacce352f2db8f63cf6436e73d26cc73d0ae77c96a68e734327bb01e27b8789f" Mar 11 09:49:08 crc kubenswrapper[4830]: I0311 09:49:08.307848 4830 generic.go:334] "Generic (PLEG): container finished" podID="ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" containerID="764fb3c99292c8856e3747cb48e5cf92fba13c8237770b3ec718beaa28e3d58e" exitCode=0 Mar 11 09:49:08 crc kubenswrapper[4830]: I0311 09:49:08.307998 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" event={"ID":"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3","Type":"ContainerDied","Data":"764fb3c99292c8856e3747cb48e5cf92fba13c8237770b3ec718beaa28e3d58e"} Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.781386 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.966510 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg2tz\" (UniqueName: \"kubernetes.io/projected/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-kube-api-access-mg2tz\") pod \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.966796 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovncontroller-config-0\") pod \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.966877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ssh-key-openstack-edpm-ipam\") pod \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.966913 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-inventory\") pod \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.966943 4830 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovn-combined-ca-bundle\") pod \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\" (UID: \"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3\") " Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.973896 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" (UID: "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.975508 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-kube-api-access-mg2tz" (OuterVolumeSpecName: "kube-api-access-mg2tz") pod "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" (UID: "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3"). InnerVolumeSpecName "kube-api-access-mg2tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.995574 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" (UID: "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:49:09 crc kubenswrapper[4830]: I0311 09:49:09.998949 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-inventory" (OuterVolumeSpecName: "inventory") pod "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" (UID: "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.000658 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" (UID: "ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.070228 4830 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.070376 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.070400 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.070418 4830 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.070431 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg2tz\" (UniqueName: \"kubernetes.io/projected/ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3-kube-api-access-mg2tz\") on node \"crc\" DevicePath \"\"" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.326944 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" event={"ID":"ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3","Type":"ContainerDied","Data":"4535ffd7725d52237c4ca25ff543bf80f6969f21431e5f460c6c11ec28deae22"} Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.327231 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4535ffd7725d52237c4ca25ff543bf80f6969f21431e5f460c6c11ec28deae22" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.326991 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-kqw9x" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.454122 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85"] Mar 11 09:49:10 crc kubenswrapper[4830]: E0311 09:49:10.454528 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.454543 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.454721 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.455451 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.458088 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.459451 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.463558 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.463690 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.463789 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.467333 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85"] Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.467457 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.478284 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.478364 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.478485 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.478574 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.478609 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.478633 4830 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwkq4\" (UniqueName: \"kubernetes.io/projected/48be5fba-f61d-4475-bbaf-df6ece9da972-kube-api-access-lwkq4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.580383 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.580434 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.580455 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwkq4\" (UniqueName: \"kubernetes.io/projected/48be5fba-f61d-4475-bbaf-df6ece9da972-kube-api-access-lwkq4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.580549 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.580578 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.581278 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.584592 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.584830 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.585331 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.586421 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.587266 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.597070 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwkq4\" (UniqueName: \"kubernetes.io/projected/48be5fba-f61d-4475-bbaf-df6ece9da972-kube-api-access-lwkq4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:10 crc kubenswrapper[4830]: I0311 09:49:10.771468 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:49:11 crc kubenswrapper[4830]: I0311 09:49:11.323453 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85"] Mar 11 09:49:12 crc kubenswrapper[4830]: I0311 09:49:12.353095 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" event={"ID":"48be5fba-f61d-4475-bbaf-df6ece9da972","Type":"ContainerStarted","Data":"0d97f225e744b7888fd426578b489a5651282428e2e4b283d1bd1cc84ee5d153"} Mar 11 09:49:12 crc kubenswrapper[4830]: I0311 09:49:12.353629 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" event={"ID":"48be5fba-f61d-4475-bbaf-df6ece9da972","Type":"ContainerStarted","Data":"ce06127508f116dc38549cd2c949f96a5a6ef6f653d77b6af0b71d418d003b80"} Mar 11 09:49:12 crc kubenswrapper[4830]: I0311 09:49:12.374451 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" podStartSLOduration=1.9192159370000001 podStartE2EDuration="2.374429478s" podCreationTimestamp="2026-03-11 09:49:10 +0000 UTC" firstStartedPulling="2026-03-11 09:49:11.336253332 +0000 UTC m=+2119.117404031" lastFinishedPulling="2026-03-11 09:49:11.791466843 +0000 UTC m=+2119.572617572" observedRunningTime="2026-03-11 09:49:12.368634601 +0000 UTC m=+2120.149785310" watchObservedRunningTime="2026-03-11 09:49:12.374429478 +0000 UTC m=+2120.155580177" Mar 11 09:49:58 crc kubenswrapper[4830]: I0311 09:49:58.846011 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="48be5fba-f61d-4475-bbaf-df6ece9da972" containerID="0d97f225e744b7888fd426578b489a5651282428e2e4b283d1bd1cc84ee5d153" exitCode=0 Mar 11 09:49:58 crc kubenswrapper[4830]: I0311 09:49:58.846054 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" event={"ID":"48be5fba-f61d-4475-bbaf-df6ece9da972","Type":"ContainerDied","Data":"0d97f225e744b7888fd426578b489a5651282428e2e4b283d1bd1cc84ee5d153"} Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.146901 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553710-wzsnn"] Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.149475 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553710-wzsnn" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.152128 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.152139 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.152377 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.162117 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553710-wzsnn"] Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.258100 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.266667 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgfhv\" (UniqueName: \"kubernetes.io/projected/777ba606-ef62-40e8-8c74-59538066c64f-kube-api-access-vgfhv\") pod \"auto-csr-approver-29553710-wzsnn\" (UID: \"777ba606-ef62-40e8-8c74-59538066c64f\") " pod="openshift-infra/auto-csr-approver-29553710-wzsnn" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.367726 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-inventory\") pod \"48be5fba-f61d-4475-bbaf-df6ece9da972\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.367826 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-ssh-key-openstack-edpm-ipam\") pod \"48be5fba-f61d-4475-bbaf-df6ece9da972\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.367878 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-nova-metadata-neutron-config-0\") pod \"48be5fba-f61d-4475-bbaf-df6ece9da972\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.367922 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-metadata-combined-ca-bundle\") pod \"48be5fba-f61d-4475-bbaf-df6ece9da972\" (UID: 
\"48be5fba-f61d-4475-bbaf-df6ece9da972\") " Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.368048 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwkq4\" (UniqueName: \"kubernetes.io/projected/48be5fba-f61d-4475-bbaf-df6ece9da972-kube-api-access-lwkq4\") pod \"48be5fba-f61d-4475-bbaf-df6ece9da972\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.368075 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-ovn-metadata-agent-neutron-config-0\") pod \"48be5fba-f61d-4475-bbaf-df6ece9da972\" (UID: \"48be5fba-f61d-4475-bbaf-df6ece9da972\") " Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.368353 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgfhv\" (UniqueName: \"kubernetes.io/projected/777ba606-ef62-40e8-8c74-59538066c64f-kube-api-access-vgfhv\") pod \"auto-csr-approver-29553710-wzsnn\" (UID: \"777ba606-ef62-40e8-8c74-59538066c64f\") " pod="openshift-infra/auto-csr-approver-29553710-wzsnn" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.374557 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48be5fba-f61d-4475-bbaf-df6ece9da972-kube-api-access-lwkq4" (OuterVolumeSpecName: "kube-api-access-lwkq4") pod "48be5fba-f61d-4475-bbaf-df6ece9da972" (UID: "48be5fba-f61d-4475-bbaf-df6ece9da972"). InnerVolumeSpecName "kube-api-access-lwkq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.374871 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "48be5fba-f61d-4475-bbaf-df6ece9da972" (UID: "48be5fba-f61d-4475-bbaf-df6ece9da972"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.385794 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgfhv\" (UniqueName: \"kubernetes.io/projected/777ba606-ef62-40e8-8c74-59538066c64f-kube-api-access-vgfhv\") pod \"auto-csr-approver-29553710-wzsnn\" (UID: \"777ba606-ef62-40e8-8c74-59538066c64f\") " pod="openshift-infra/auto-csr-approver-29553710-wzsnn" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.401788 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "48be5fba-f61d-4475-bbaf-df6ece9da972" (UID: "48be5fba-f61d-4475-bbaf-df6ece9da972"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.404196 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "48be5fba-f61d-4475-bbaf-df6ece9da972" (UID: "48be5fba-f61d-4475-bbaf-df6ece9da972"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.406357 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-inventory" (OuterVolumeSpecName: "inventory") pod "48be5fba-f61d-4475-bbaf-df6ece9da972" (UID: "48be5fba-f61d-4475-bbaf-df6ece9da972"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.407955 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "48be5fba-f61d-4475-bbaf-df6ece9da972" (UID: "48be5fba-f61d-4475-bbaf-df6ece9da972"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.470394 4830 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.470435 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwkq4\" (UniqueName: \"kubernetes.io/projected/48be5fba-f61d-4475-bbaf-df6ece9da972-kube-api-access-lwkq4\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.470451 4830 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.470467 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.470477 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.470488 4830 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/48be5fba-f61d-4475-bbaf-df6ece9da972-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.556360 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553710-wzsnn" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.864618 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" event={"ID":"48be5fba-f61d-4475-bbaf-df6ece9da972","Type":"ContainerDied","Data":"ce06127508f116dc38549cd2c949f96a5a6ef6f653d77b6af0b71d418d003b80"} Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.864678 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.864669 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce06127508f116dc38549cd2c949f96a5a6ef6f653d77b6af0b71d418d003b80" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.952183 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq"] Mar 11 09:50:00 crc kubenswrapper[4830]: E0311 09:50:00.952491 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48be5fba-f61d-4475-bbaf-df6ece9da972" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.952506 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="48be5fba-f61d-4475-bbaf-df6ece9da972" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.952748 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="48be5fba-f61d-4475-bbaf-df6ece9da972" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.953578 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.955619 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.958883 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.959079 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.959188 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.959311 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:50:00 crc kubenswrapper[4830]: I0311 09:50:00.960107 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq"] Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.006256 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553710-wzsnn"] Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.085851 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xb2\" (UniqueName: \"kubernetes.io/projected/bfdc6f64-813a-4a57-a123-b4d15c6ae569-kube-api-access-h6xb2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.085933 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.085969 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.086032 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.086507 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.188318 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6xb2\" (UniqueName: \"kubernetes.io/projected/bfdc6f64-813a-4a57-a123-b4d15c6ae569-kube-api-access-h6xb2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: 
\"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.188387 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.188457 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.188507 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.188631 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.194619 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.194646 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.195363 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.196834 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.209524 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xb2\" (UniqueName: \"kubernetes.io/projected/bfdc6f64-813a-4a57-a123-b4d15c6ae569-kube-api-access-h6xb2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.276168 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:50:01 crc kubenswrapper[4830]: W0311 09:50:01.793854 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfdc6f64_813a_4a57_a123_b4d15c6ae569.slice/crio-722a778007f9721838f096e8465969ea65d8fbffa18a344ca2b74e62fbba0d68 WatchSource:0}: Error finding container 722a778007f9721838f096e8465969ea65d8fbffa18a344ca2b74e62fbba0d68: Status 404 returned error can't find the container with id 722a778007f9721838f096e8465969ea65d8fbffa18a344ca2b74e62fbba0d68 Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.794248 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq"] Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.880790 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553710-wzsnn" event={"ID":"777ba606-ef62-40e8-8c74-59538066c64f","Type":"ContainerStarted","Data":"4c30b74ed862463f5f7696c5c8263411d7f25044f33d5cda5bf350626f86dc97"} Mar 11 09:50:01 crc kubenswrapper[4830]: I0311 09:50:01.882748 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" event={"ID":"bfdc6f64-813a-4a57-a123-b4d15c6ae569","Type":"ContainerStarted","Data":"722a778007f9721838f096e8465969ea65d8fbffa18a344ca2b74e62fbba0d68"} Mar 11 09:50:02 crc kubenswrapper[4830]: I0311 09:50:02.893105 4830 generic.go:334] "Generic (PLEG): container finished" podID="777ba606-ef62-40e8-8c74-59538066c64f" containerID="3bd92cd6e9689468af8f75daedb522e0361f0613a25b26cbe5a0c8b3afb6223d" exitCode=0 Mar 11 09:50:02 crc kubenswrapper[4830]: I0311 09:50:02.893335 4830 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553710-wzsnn" event={"ID":"777ba606-ef62-40e8-8c74-59538066c64f","Type":"ContainerDied","Data":"3bd92cd6e9689468af8f75daedb522e0361f0613a25b26cbe5a0c8b3afb6223d"} Mar 11 09:50:02 crc kubenswrapper[4830]: I0311 09:50:02.901272 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" event={"ID":"bfdc6f64-813a-4a57-a123-b4d15c6ae569","Type":"ContainerStarted","Data":"a1be9b6da457602daaf573824ff52181445e2c1c96862ad30320eedb179fdc8c"} Mar 11 09:50:02 crc kubenswrapper[4830]: I0311 09:50:02.924169 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" podStartSLOduration=2.423574797 podStartE2EDuration="2.924151905s" podCreationTimestamp="2026-03-11 09:50:00 +0000 UTC" firstStartedPulling="2026-03-11 09:50:01.796337267 +0000 UTC m=+2169.577487956" lastFinishedPulling="2026-03-11 09:50:02.296914385 +0000 UTC m=+2170.078065064" observedRunningTime="2026-03-11 09:50:02.921952535 +0000 UTC m=+2170.703103224" watchObservedRunningTime="2026-03-11 09:50:02.924151905 +0000 UTC m=+2170.705302584" Mar 11 09:50:04 crc kubenswrapper[4830]: I0311 09:50:04.213942 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553710-wzsnn" Mar 11 09:50:04 crc kubenswrapper[4830]: I0311 09:50:04.340912 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgfhv\" (UniqueName: \"kubernetes.io/projected/777ba606-ef62-40e8-8c74-59538066c64f-kube-api-access-vgfhv\") pod \"777ba606-ef62-40e8-8c74-59538066c64f\" (UID: \"777ba606-ef62-40e8-8c74-59538066c64f\") " Mar 11 09:50:04 crc kubenswrapper[4830]: I0311 09:50:04.347208 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777ba606-ef62-40e8-8c74-59538066c64f-kube-api-access-vgfhv" (OuterVolumeSpecName: "kube-api-access-vgfhv") pod "777ba606-ef62-40e8-8c74-59538066c64f" (UID: "777ba606-ef62-40e8-8c74-59538066c64f"). InnerVolumeSpecName "kube-api-access-vgfhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:50:04 crc kubenswrapper[4830]: I0311 09:50:04.442560 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgfhv\" (UniqueName: \"kubernetes.io/projected/777ba606-ef62-40e8-8c74-59538066c64f-kube-api-access-vgfhv\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:04 crc kubenswrapper[4830]: I0311 09:50:04.920403 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553710-wzsnn" event={"ID":"777ba606-ef62-40e8-8c74-59538066c64f","Type":"ContainerDied","Data":"4c30b74ed862463f5f7696c5c8263411d7f25044f33d5cda5bf350626f86dc97"} Mar 11 09:50:04 crc kubenswrapper[4830]: I0311 09:50:04.920515 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c30b74ed862463f5f7696c5c8263411d7f25044f33d5cda5bf350626f86dc97" Mar 11 09:50:04 crc kubenswrapper[4830]: I0311 09:50:04.920522 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553710-wzsnn" Mar 11 09:50:05 crc kubenswrapper[4830]: I0311 09:50:05.293370 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553704-n9t6z"] Mar 11 09:50:05 crc kubenswrapper[4830]: I0311 09:50:05.302306 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553704-n9t6z"] Mar 11 09:50:06 crc kubenswrapper[4830]: I0311 09:50:06.950285 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6fbeba2-4a06-4727-9bae-8470dc0b1c4e" path="/var/lib/kubelet/pods/f6fbeba2-4a06-4727-9bae-8470dc0b1c4e/volumes" Mar 11 09:50:12 crc kubenswrapper[4830]: I0311 09:50:12.275719 4830 scope.go:117] "RemoveContainer" containerID="ca9ed8e579f7430a790d3a430c56bdb363dc32dedc87ac44b3208285e8a1a731" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.264165 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8bv9v"] Mar 11 09:50:17 crc kubenswrapper[4830]: E0311 09:50:17.264976 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777ba606-ef62-40e8-8c74-59538066c64f" containerName="oc" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.264996 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="777ba606-ef62-40e8-8c74-59538066c64f" containerName="oc" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.265258 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="777ba606-ef62-40e8-8c74-59538066c64f" containerName="oc" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.266936 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.284696 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8bv9v"] Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.438664 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rlm9\" (UniqueName: \"kubernetes.io/projected/cf09bbca-2035-41e1-b458-c69da0da1d8e-kube-api-access-2rlm9\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.438755 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-utilities\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.438876 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-catalog-content\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.540928 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rlm9\" (UniqueName: \"kubernetes.io/projected/cf09bbca-2035-41e1-b458-c69da0da1d8e-kube-api-access-2rlm9\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.540993 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-utilities\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.541053 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-catalog-content\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.541733 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-catalog-content\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.541738 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-utilities\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.565245 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rlm9\" (UniqueName: \"kubernetes.io/projected/cf09bbca-2035-41e1-b458-c69da0da1d8e-kube-api-access-2rlm9\") pod \"redhat-operators-8bv9v\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:17 crc kubenswrapper[4830]: I0311 09:50:17.599994 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:18 crc kubenswrapper[4830]: I0311 09:50:18.097275 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8bv9v"] Mar 11 09:50:19 crc kubenswrapper[4830]: I0311 09:50:19.046566 4830 generic.go:334] "Generic (PLEG): container finished" podID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerID="ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f" exitCode=0 Mar 11 09:50:19 crc kubenswrapper[4830]: I0311 09:50:19.046628 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bv9v" event={"ID":"cf09bbca-2035-41e1-b458-c69da0da1d8e","Type":"ContainerDied","Data":"ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f"} Mar 11 09:50:19 crc kubenswrapper[4830]: I0311 09:50:19.046810 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bv9v" event={"ID":"cf09bbca-2035-41e1-b458-c69da0da1d8e","Type":"ContainerStarted","Data":"2a52c604046c6515cf30ef743457e41021eb96e6f32371688cc17ec80e709475"} Mar 11 09:50:19 crc kubenswrapper[4830]: I0311 09:50:19.048447 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.641504 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g62l4"] Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.645658 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.659501 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g62l4"] Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.831770 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-catalog-content\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.831946 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zj2r\" (UniqueName: \"kubernetes.io/projected/c2bcebfb-5c0c-4f12-9155-780fba4fd885-kube-api-access-5zj2r\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.832031 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-utilities\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.934847 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-catalog-content\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.935067 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5zj2r\" (UniqueName: \"kubernetes.io/projected/c2bcebfb-5c0c-4f12-9155-780fba4fd885-kube-api-access-5zj2r\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.935152 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-utilities\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.936733 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-utilities\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.937141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-catalog-content\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:21 crc kubenswrapper[4830]: I0311 09:50:21.963573 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zj2r\" (UniqueName: \"kubernetes.io/projected/c2bcebfb-5c0c-4f12-9155-780fba4fd885-kube-api-access-5zj2r\") pod \"redhat-marketplace-g62l4\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:22 crc kubenswrapper[4830]: I0311 09:50:22.003675 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:22 crc kubenswrapper[4830]: I0311 09:50:22.085167 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bv9v" event={"ID":"cf09bbca-2035-41e1-b458-c69da0da1d8e","Type":"ContainerStarted","Data":"9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3"} Mar 11 09:50:22 crc kubenswrapper[4830]: I0311 09:50:22.370074 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g62l4"] Mar 11 09:50:23 crc kubenswrapper[4830]: I0311 09:50:23.098033 4830 generic.go:334] "Generic (PLEG): container finished" podID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerID="00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e" exitCode=0 Mar 11 09:50:23 crc kubenswrapper[4830]: I0311 09:50:23.098088 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g62l4" event={"ID":"c2bcebfb-5c0c-4f12-9155-780fba4fd885","Type":"ContainerDied","Data":"00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e"} Mar 11 09:50:23 crc kubenswrapper[4830]: I0311 09:50:23.098140 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g62l4" event={"ID":"c2bcebfb-5c0c-4f12-9155-780fba4fd885","Type":"ContainerStarted","Data":"9adecd20fb95296310ae384a25192bc6752ef254ba38b55b388f08468b594323"} Mar 11 09:50:25 crc kubenswrapper[4830]: I0311 09:50:25.119202 4830 generic.go:334] "Generic (PLEG): container finished" podID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerID="49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1" exitCode=0 Mar 11 09:50:25 crc kubenswrapper[4830]: I0311 09:50:25.119249 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g62l4" 
event={"ID":"c2bcebfb-5c0c-4f12-9155-780fba4fd885","Type":"ContainerDied","Data":"49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1"} Mar 11 09:50:26 crc kubenswrapper[4830]: I0311 09:50:26.132444 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g62l4" event={"ID":"c2bcebfb-5c0c-4f12-9155-780fba4fd885","Type":"ContainerStarted","Data":"f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a"} Mar 11 09:50:26 crc kubenswrapper[4830]: I0311 09:50:26.155995 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g62l4" podStartSLOduration=2.700932309 podStartE2EDuration="5.155972267s" podCreationTimestamp="2026-03-11 09:50:21 +0000 UTC" firstStartedPulling="2026-03-11 09:50:23.101672077 +0000 UTC m=+2190.882822776" lastFinishedPulling="2026-03-11 09:50:25.556712045 +0000 UTC m=+2193.337862734" observedRunningTime="2026-03-11 09:50:26.148052832 +0000 UTC m=+2193.929203541" watchObservedRunningTime="2026-03-11 09:50:26.155972267 +0000 UTC m=+2193.937122956" Mar 11 09:50:28 crc kubenswrapper[4830]: I0311 09:50:28.153406 4830 generic.go:334] "Generic (PLEG): container finished" podID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerID="9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3" exitCode=0 Mar 11 09:50:28 crc kubenswrapper[4830]: I0311 09:50:28.153713 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bv9v" event={"ID":"cf09bbca-2035-41e1-b458-c69da0da1d8e","Type":"ContainerDied","Data":"9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3"} Mar 11 09:50:29 crc kubenswrapper[4830]: I0311 09:50:29.170648 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bv9v" 
event={"ID":"cf09bbca-2035-41e1-b458-c69da0da1d8e","Type":"ContainerStarted","Data":"e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f"} Mar 11 09:50:29 crc kubenswrapper[4830]: I0311 09:50:29.192456 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8bv9v" podStartSLOduration=2.612061838 podStartE2EDuration="12.192437585s" podCreationTimestamp="2026-03-11 09:50:17 +0000 UTC" firstStartedPulling="2026-03-11 09:50:19.048196405 +0000 UTC m=+2186.829347094" lastFinishedPulling="2026-03-11 09:50:28.628572152 +0000 UTC m=+2196.409722841" observedRunningTime="2026-03-11 09:50:29.187352227 +0000 UTC m=+2196.968502926" watchObservedRunningTime="2026-03-11 09:50:29.192437585 +0000 UTC m=+2196.973588274" Mar 11 09:50:31 crc kubenswrapper[4830]: I0311 09:50:31.966772 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7kjc8"] Mar 11 09:50:31 crc kubenswrapper[4830]: I0311 09:50:31.969648 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.001637 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7kjc8"] Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.005240 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.005769 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.060171 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.142449 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-catalog-content\") pod \"certified-operators-7kjc8\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.142521 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vqh\" (UniqueName: \"kubernetes.io/projected/b6f5d566-b353-4d5c-a4f0-e1230e93721a-kube-api-access-68vqh\") pod \"certified-operators-7kjc8\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.142585 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-utilities\") pod \"certified-operators-7kjc8\" (UID: 
\"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.244196 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vqh\" (UniqueName: \"kubernetes.io/projected/b6f5d566-b353-4d5c-a4f0-e1230e93721a-kube-api-access-68vqh\") pod \"certified-operators-7kjc8\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.244534 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-utilities\") pod \"certified-operators-7kjc8\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.244638 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-catalog-content\") pod \"certified-operators-7kjc8\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.245176 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-catalog-content\") pod \"certified-operators-7kjc8\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.245353 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-utilities\") pod \"certified-operators-7kjc8\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") 
" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.245863 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.270902 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vqh\" (UniqueName: \"kubernetes.io/projected/b6f5d566-b353-4d5c-a4f0-e1230e93721a-kube-api-access-68vqh\") pod \"certified-operators-7kjc8\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.290867 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:32 crc kubenswrapper[4830]: I0311 09:50:32.814575 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7kjc8"] Mar 11 09:50:32 crc kubenswrapper[4830]: W0311 09:50:32.815685 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f5d566_b353_4d5c_a4f0_e1230e93721a.slice/crio-c465f968dd4ca69cfc0aa3c46266198d5749ee1cfbee1359c70d9f5bf956cac2 WatchSource:0}: Error finding container c465f968dd4ca69cfc0aa3c46266198d5749ee1cfbee1359c70d9f5bf956cac2: Status 404 returned error can't find the container with id c465f968dd4ca69cfc0aa3c46266198d5749ee1cfbee1359c70d9f5bf956cac2 Mar 11 09:50:33 crc kubenswrapper[4830]: I0311 09:50:33.214453 4830 generic.go:334] "Generic (PLEG): container finished" podID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerID="b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded" exitCode=0 Mar 11 09:50:33 crc kubenswrapper[4830]: I0311 09:50:33.214498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjc8" 
event={"ID":"b6f5d566-b353-4d5c-a4f0-e1230e93721a","Type":"ContainerDied","Data":"b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded"} Mar 11 09:50:33 crc kubenswrapper[4830]: I0311 09:50:33.215461 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjc8" event={"ID":"b6f5d566-b353-4d5c-a4f0-e1230e93721a","Type":"ContainerStarted","Data":"c465f968dd4ca69cfc0aa3c46266198d5749ee1cfbee1359c70d9f5bf956cac2"} Mar 11 09:50:34 crc kubenswrapper[4830]: I0311 09:50:34.344866 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g62l4"] Mar 11 09:50:35 crc kubenswrapper[4830]: I0311 09:50:35.235801 4830 generic.go:334] "Generic (PLEG): container finished" podID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerID="da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959" exitCode=0 Mar 11 09:50:35 crc kubenswrapper[4830]: I0311 09:50:35.235849 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjc8" event={"ID":"b6f5d566-b353-4d5c-a4f0-e1230e93721a","Type":"ContainerDied","Data":"da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959"} Mar 11 09:50:35 crc kubenswrapper[4830]: I0311 09:50:35.236493 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g62l4" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerName="registry-server" containerID="cri-o://f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a" gracePeriod=2 Mar 11 09:50:35 crc kubenswrapper[4830]: I0311 09:50:35.871283 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.014639 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zj2r\" (UniqueName: \"kubernetes.io/projected/c2bcebfb-5c0c-4f12-9155-780fba4fd885-kube-api-access-5zj2r\") pod \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.014908 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-catalog-content\") pod \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.014940 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-utilities\") pod \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\" (UID: \"c2bcebfb-5c0c-4f12-9155-780fba4fd885\") " Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.015849 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-utilities" (OuterVolumeSpecName: "utilities") pod "c2bcebfb-5c0c-4f12-9155-780fba4fd885" (UID: "c2bcebfb-5c0c-4f12-9155-780fba4fd885"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.035997 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2bcebfb-5c0c-4f12-9155-780fba4fd885-kube-api-access-5zj2r" (OuterVolumeSpecName: "kube-api-access-5zj2r") pod "c2bcebfb-5c0c-4f12-9155-780fba4fd885" (UID: "c2bcebfb-5c0c-4f12-9155-780fba4fd885"). InnerVolumeSpecName "kube-api-access-5zj2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.057912 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2bcebfb-5c0c-4f12-9155-780fba4fd885" (UID: "c2bcebfb-5c0c-4f12-9155-780fba4fd885"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.118116 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.118153 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2bcebfb-5c0c-4f12-9155-780fba4fd885-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.118166 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zj2r\" (UniqueName: \"kubernetes.io/projected/c2bcebfb-5c0c-4f12-9155-780fba4fd885-kube-api-access-5zj2r\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.247392 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjc8" event={"ID":"b6f5d566-b353-4d5c-a4f0-e1230e93721a","Type":"ContainerStarted","Data":"d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa"} Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.249737 4830 generic.go:334] "Generic (PLEG): container finished" podID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerID="f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a" exitCode=0 Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.249785 4830 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-g62l4" event={"ID":"c2bcebfb-5c0c-4f12-9155-780fba4fd885","Type":"ContainerDied","Data":"f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a"} Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.249812 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g62l4" event={"ID":"c2bcebfb-5c0c-4f12-9155-780fba4fd885","Type":"ContainerDied","Data":"9adecd20fb95296310ae384a25192bc6752ef254ba38b55b388f08468b594323"} Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.249820 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g62l4" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.249830 4830 scope.go:117] "RemoveContainer" containerID="f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.274026 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7kjc8" podStartSLOduration=2.77439794 podStartE2EDuration="5.273986346s" podCreationTimestamp="2026-03-11 09:50:31 +0000 UTC" firstStartedPulling="2026-03-11 09:50:33.217964299 +0000 UTC m=+2200.999114988" lastFinishedPulling="2026-03-11 09:50:35.717552705 +0000 UTC m=+2203.498703394" observedRunningTime="2026-03-11 09:50:36.267775688 +0000 UTC m=+2204.048926387" watchObservedRunningTime="2026-03-11 09:50:36.273986346 +0000 UTC m=+2204.055137035" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.274639 4830 scope.go:117] "RemoveContainer" containerID="49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.293743 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g62l4"] Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.301155 4830 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-g62l4"] Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.317196 4830 scope.go:117] "RemoveContainer" containerID="00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.344516 4830 scope.go:117] "RemoveContainer" containerID="f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a" Mar 11 09:50:36 crc kubenswrapper[4830]: E0311 09:50:36.345069 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a\": container with ID starting with f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a not found: ID does not exist" containerID="f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.345109 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a"} err="failed to get container status \"f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a\": rpc error: code = NotFound desc = could not find container \"f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a\": container with ID starting with f949a18206eaa5c7eca0da18e963742dc7fc88d42f22d2055b69caabddf4598a not found: ID does not exist" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.345136 4830 scope.go:117] "RemoveContainer" containerID="49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1" Mar 11 09:50:36 crc kubenswrapper[4830]: E0311 09:50:36.345576 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1\": container with ID starting with 
49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1 not found: ID does not exist" containerID="49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.345819 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1"} err="failed to get container status \"49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1\": rpc error: code = NotFound desc = could not find container \"49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1\": container with ID starting with 49742de5e0ea526f91674639f3e0b9259c1fc21571c8d66712c60841614732f1 not found: ID does not exist" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.345845 4830 scope.go:117] "RemoveContainer" containerID="00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e" Mar 11 09:50:36 crc kubenswrapper[4830]: E0311 09:50:36.346272 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e\": container with ID starting with 00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e not found: ID does not exist" containerID="00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.346304 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e"} err="failed to get container status \"00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e\": rpc error: code = NotFound desc = could not find container \"00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e\": container with ID starting with 00b267d1e90f12bcd8591f89e784b49bf392bf789aa6b295c617009cdf8cb90e not found: ID does not 
exist" Mar 11 09:50:36 crc kubenswrapper[4830]: I0311 09:50:36.943440 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" path="/var/lib/kubelet/pods/c2bcebfb-5c0c-4f12-9155-780fba4fd885/volumes" Mar 11 09:50:37 crc kubenswrapper[4830]: I0311 09:50:37.601084 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:37 crc kubenswrapper[4830]: I0311 09:50:37.601148 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:37 crc kubenswrapper[4830]: I0311 09:50:37.651519 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:38 crc kubenswrapper[4830]: I0311 09:50:38.311473 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:40 crc kubenswrapper[4830]: I0311 09:50:40.544383 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8bv9v"] Mar 11 09:50:40 crc kubenswrapper[4830]: I0311 09:50:40.544937 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8bv9v" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerName="registry-server" containerID="cri-o://e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f" gracePeriod=2 Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.014217 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.117473 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rlm9\" (UniqueName: \"kubernetes.io/projected/cf09bbca-2035-41e1-b458-c69da0da1d8e-kube-api-access-2rlm9\") pod \"cf09bbca-2035-41e1-b458-c69da0da1d8e\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.117618 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-catalog-content\") pod \"cf09bbca-2035-41e1-b458-c69da0da1d8e\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.117787 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-utilities\") pod \"cf09bbca-2035-41e1-b458-c69da0da1d8e\" (UID: \"cf09bbca-2035-41e1-b458-c69da0da1d8e\") " Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.119215 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-utilities" (OuterVolumeSpecName: "utilities") pod "cf09bbca-2035-41e1-b458-c69da0da1d8e" (UID: "cf09bbca-2035-41e1-b458-c69da0da1d8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.124245 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf09bbca-2035-41e1-b458-c69da0da1d8e-kube-api-access-2rlm9" (OuterVolumeSpecName: "kube-api-access-2rlm9") pod "cf09bbca-2035-41e1-b458-c69da0da1d8e" (UID: "cf09bbca-2035-41e1-b458-c69da0da1d8e"). InnerVolumeSpecName "kube-api-access-2rlm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.221382 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rlm9\" (UniqueName: \"kubernetes.io/projected/cf09bbca-2035-41e1-b458-c69da0da1d8e-kube-api-access-2rlm9\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.221460 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.240062 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf09bbca-2035-41e1-b458-c69da0da1d8e" (UID: "cf09bbca-2035-41e1-b458-c69da0da1d8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.300186 4830 generic.go:334] "Generic (PLEG): container finished" podID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerID="e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f" exitCode=0 Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.300312 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8bv9v" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.300507 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bv9v" event={"ID":"cf09bbca-2035-41e1-b458-c69da0da1d8e","Type":"ContainerDied","Data":"e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f"} Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.300638 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8bv9v" event={"ID":"cf09bbca-2035-41e1-b458-c69da0da1d8e","Type":"ContainerDied","Data":"2a52c604046c6515cf30ef743457e41021eb96e6f32371688cc17ec80e709475"} Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.300753 4830 scope.go:117] "RemoveContainer" containerID="e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.323249 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf09bbca-2035-41e1-b458-c69da0da1d8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.327125 4830 scope.go:117] "RemoveContainer" containerID="9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.341265 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8bv9v"] Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.350941 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8bv9v"] Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.355339 4830 scope.go:117] "RemoveContainer" containerID="ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.394850 4830 scope.go:117] "RemoveContainer" 
containerID="e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f" Mar 11 09:50:41 crc kubenswrapper[4830]: E0311 09:50:41.395318 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f\": container with ID starting with e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f not found: ID does not exist" containerID="e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.395372 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f"} err="failed to get container status \"e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f\": rpc error: code = NotFound desc = could not find container \"e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f\": container with ID starting with e718232a6918647e33b095f4b311c6026ef1d2a0a19ad98838ac094206a85c6f not found: ID does not exist" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.395403 4830 scope.go:117] "RemoveContainer" containerID="9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3" Mar 11 09:50:41 crc kubenswrapper[4830]: E0311 09:50:41.397349 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3\": container with ID starting with 9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3 not found: ID does not exist" containerID="9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.397395 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3"} err="failed to get container status \"9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3\": rpc error: code = NotFound desc = could not find container \"9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3\": container with ID starting with 9897cb0b86ec2f4c9226b8b9d8d6a5e0ce933a92cd306dc39de692fb24ff7ff3 not found: ID does not exist" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.397428 4830 scope.go:117] "RemoveContainer" containerID="ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f" Mar 11 09:50:41 crc kubenswrapper[4830]: E0311 09:50:41.397756 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f\": container with ID starting with ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f not found: ID does not exist" containerID="ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f" Mar 11 09:50:41 crc kubenswrapper[4830]: I0311 09:50:41.397822 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f"} err="failed to get container status \"ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f\": rpc error: code = NotFound desc = could not find container \"ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f\": container with ID starting with ff2f6e3925e1a8c9e9459546c3d1c06de2fcc1e5e80db0ff43c561cc8ee8a32f not found: ID does not exist" Mar 11 09:50:42 crc kubenswrapper[4830]: I0311 09:50:42.290920 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:42 crc kubenswrapper[4830]: I0311 09:50:42.291225 4830 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:42 crc kubenswrapper[4830]: I0311 09:50:42.343826 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:42 crc kubenswrapper[4830]: I0311 09:50:42.391113 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:42 crc kubenswrapper[4830]: I0311 09:50:42.944849 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" path="/var/lib/kubelet/pods/cf09bbca-2035-41e1-b458-c69da0da1d8e/volumes" Mar 11 09:50:43 crc kubenswrapper[4830]: I0311 09:50:43.060711 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:50:43 crc kubenswrapper[4830]: I0311 09:50:43.060789 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:50:43 crc kubenswrapper[4830]: I0311 09:50:43.744666 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7kjc8"] Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.327963 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7kjc8" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerName="registry-server" containerID="cri-o://d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa" 
gracePeriod=2 Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.768400 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.892949 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vqh\" (UniqueName: \"kubernetes.io/projected/b6f5d566-b353-4d5c-a4f0-e1230e93721a-kube-api-access-68vqh\") pod \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.893102 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-catalog-content\") pod \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.893436 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-utilities\") pod \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\" (UID: \"b6f5d566-b353-4d5c-a4f0-e1230e93721a\") " Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.894415 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-utilities" (OuterVolumeSpecName: "utilities") pod "b6f5d566-b353-4d5c-a4f0-e1230e93721a" (UID: "b6f5d566-b353-4d5c-a4f0-e1230e93721a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.898841 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f5d566-b353-4d5c-a4f0-e1230e93721a-kube-api-access-68vqh" (OuterVolumeSpecName: "kube-api-access-68vqh") pod "b6f5d566-b353-4d5c-a4f0-e1230e93721a" (UID: "b6f5d566-b353-4d5c-a4f0-e1230e93721a"). InnerVolumeSpecName "kube-api-access-68vqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.954949 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6f5d566-b353-4d5c-a4f0-e1230e93721a" (UID: "b6f5d566-b353-4d5c-a4f0-e1230e93721a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.995422 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.995460 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f5d566-b353-4d5c-a4f0-e1230e93721a-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:44 crc kubenswrapper[4830]: I0311 09:50:44.995471 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vqh\" (UniqueName: \"kubernetes.io/projected/b6f5d566-b353-4d5c-a4f0-e1230e93721a-kube-api-access-68vqh\") on node \"crc\" DevicePath \"\"" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.336918 4830 generic.go:334] "Generic (PLEG): container finished" podID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" 
containerID="d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa" exitCode=0 Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.336964 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjc8" event={"ID":"b6f5d566-b353-4d5c-a4f0-e1230e93721a","Type":"ContainerDied","Data":"d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa"} Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.336990 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjc8" event={"ID":"b6f5d566-b353-4d5c-a4f0-e1230e93721a","Type":"ContainerDied","Data":"c465f968dd4ca69cfc0aa3c46266198d5749ee1cfbee1359c70d9f5bf956cac2"} Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.337008 4830 scope.go:117] "RemoveContainer" containerID="d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.337041 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7kjc8" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.356749 4830 scope.go:117] "RemoveContainer" containerID="da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.369038 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7kjc8"] Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.377790 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7kjc8"] Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.395727 4830 scope.go:117] "RemoveContainer" containerID="b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.433271 4830 scope.go:117] "RemoveContainer" containerID="d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa" Mar 11 09:50:45 crc kubenswrapper[4830]: E0311 09:50:45.433667 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa\": container with ID starting with d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa not found: ID does not exist" containerID="d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.433728 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa"} err="failed to get container status \"d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa\": rpc error: code = NotFound desc = could not find container \"d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa\": container with ID starting with d2b09c408e0cd0f5f9e0244d07eb95f1ec125d64ffe5eaaa091e0bae42e589fa not 
found: ID does not exist" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.433760 4830 scope.go:117] "RemoveContainer" containerID="da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959" Mar 11 09:50:45 crc kubenswrapper[4830]: E0311 09:50:45.434037 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959\": container with ID starting with da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959 not found: ID does not exist" containerID="da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.434068 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959"} err="failed to get container status \"da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959\": rpc error: code = NotFound desc = could not find container \"da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959\": container with ID starting with da063e9c23c66de0b1b0df240042a2c45d6c3bd56922fd8b68d1f8409cc0c959 not found: ID does not exist" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.434085 4830 scope.go:117] "RemoveContainer" containerID="b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded" Mar 11 09:50:45 crc kubenswrapper[4830]: E0311 09:50:45.434960 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded\": container with ID starting with b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded not found: ID does not exist" containerID="b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded" Mar 11 09:50:45 crc kubenswrapper[4830]: I0311 09:50:45.434997 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded"} err="failed to get container status \"b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded\": rpc error: code = NotFound desc = could not find container \"b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded\": container with ID starting with b6849f6e5724cf0f60ba97fbc3989bdca8f1efa4ef52df8ad4db8fedbb211ded not found: ID does not exist" Mar 11 09:50:46 crc kubenswrapper[4830]: I0311 09:50:46.942098 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" path="/var/lib/kubelet/pods/b6f5d566-b353-4d5c-a4f0-e1230e93721a/volumes" Mar 11 09:51:13 crc kubenswrapper[4830]: I0311 09:51:13.060475 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:51:13 crc kubenswrapper[4830]: I0311 09:51:13.060972 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:51:43 crc kubenswrapper[4830]: I0311 09:51:43.060239 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:51:43 crc kubenswrapper[4830]: I0311 09:51:43.061050 4830 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:51:43 crc kubenswrapper[4830]: I0311 09:51:43.061106 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 09:51:43 crc kubenswrapper[4830]: I0311 09:51:43.061923 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 09:51:43 crc kubenswrapper[4830]: I0311 09:51:43.061994 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" gracePeriod=600 Mar 11 09:51:43 crc kubenswrapper[4830]: E0311 09:51:43.249681 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:51:44 crc kubenswrapper[4830]: I0311 09:51:44.012211 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" 
containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" exitCode=0 Mar 11 09:51:44 crc kubenswrapper[4830]: I0311 09:51:44.012255 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8"} Mar 11 09:51:44 crc kubenswrapper[4830]: I0311 09:51:44.012287 4830 scope.go:117] "RemoveContainer" containerID="4f3c625d98358eb5bb4ebc7964cb1866ae7af600501322ab72c5f9b2bdd25068" Mar 11 09:51:44 crc kubenswrapper[4830]: I0311 09:51:44.012941 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:51:44 crc kubenswrapper[4830]: E0311 09:51:44.013187 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:51:56 crc kubenswrapper[4830]: I0311 09:51:56.933644 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:51:56 crc kubenswrapper[4830]: E0311 09:51:56.934451 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 
09:52:00.151772 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553712-5j9rz"] Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152535 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152547 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152561 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerName="extract-content" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152566 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerName="extract-content" Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152576 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerName="extract-content" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152582 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerName="extract-content" Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152593 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerName="extract-utilities" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152599 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerName="extract-utilities" Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152609 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerName="extract-utilities" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152622 4830 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerName="extract-utilities" Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152637 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerName="extract-utilities" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152643 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerName="extract-utilities" Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152661 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152667 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152683 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerName="extract-content" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152689 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerName="extract-content" Mar 11 09:52:00 crc kubenswrapper[4830]: E0311 09:52:00.152697 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152703 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152891 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2bcebfb-5c0c-4f12-9155-780fba4fd885" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152904 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f5d566-b353-4d5c-a4f0-e1230e93721a" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.152920 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf09bbca-2035-41e1-b458-c69da0da1d8e" containerName="registry-server" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.153676 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553712-5j9rz" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.156047 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.156400 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.156751 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.165971 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553712-5j9rz"] Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.171592 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctfmh\" (UniqueName: \"kubernetes.io/projected/37d876e5-5e9a-49d5-8bf4-306593bcb686-kube-api-access-ctfmh\") pod \"auto-csr-approver-29553712-5j9rz\" (UID: \"37d876e5-5e9a-49d5-8bf4-306593bcb686\") " pod="openshift-infra/auto-csr-approver-29553712-5j9rz" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.273288 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctfmh\" (UniqueName: \"kubernetes.io/projected/37d876e5-5e9a-49d5-8bf4-306593bcb686-kube-api-access-ctfmh\") pod \"auto-csr-approver-29553712-5j9rz\" (UID: 
\"37d876e5-5e9a-49d5-8bf4-306593bcb686\") " pod="openshift-infra/auto-csr-approver-29553712-5j9rz" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.295233 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctfmh\" (UniqueName: \"kubernetes.io/projected/37d876e5-5e9a-49d5-8bf4-306593bcb686-kube-api-access-ctfmh\") pod \"auto-csr-approver-29553712-5j9rz\" (UID: \"37d876e5-5e9a-49d5-8bf4-306593bcb686\") " pod="openshift-infra/auto-csr-approver-29553712-5j9rz" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.478820 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553712-5j9rz" Mar 11 09:52:00 crc kubenswrapper[4830]: I0311 09:52:00.929502 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553712-5j9rz"] Mar 11 09:52:01 crc kubenswrapper[4830]: I0311 09:52:01.287486 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553712-5j9rz" event={"ID":"37d876e5-5e9a-49d5-8bf4-306593bcb686","Type":"ContainerStarted","Data":"02bf652b0beb8e5e5e92e279be3a5fa96968113a507ea94a64794ed53df97a8b"} Mar 11 09:52:02 crc kubenswrapper[4830]: I0311 09:52:02.300152 4830 generic.go:334] "Generic (PLEG): container finished" podID="37d876e5-5e9a-49d5-8bf4-306593bcb686" containerID="293c557ec8fcb06fcacad5f57473e6648a62082d73fbb9933a969f1227b135fe" exitCode=0 Mar 11 09:52:02 crc kubenswrapper[4830]: I0311 09:52:02.300367 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553712-5j9rz" event={"ID":"37d876e5-5e9a-49d5-8bf4-306593bcb686","Type":"ContainerDied","Data":"293c557ec8fcb06fcacad5f57473e6648a62082d73fbb9933a969f1227b135fe"} Mar 11 09:52:03 crc kubenswrapper[4830]: I0311 09:52:03.599609 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553712-5j9rz" Mar 11 09:52:03 crc kubenswrapper[4830]: I0311 09:52:03.739503 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctfmh\" (UniqueName: \"kubernetes.io/projected/37d876e5-5e9a-49d5-8bf4-306593bcb686-kube-api-access-ctfmh\") pod \"37d876e5-5e9a-49d5-8bf4-306593bcb686\" (UID: \"37d876e5-5e9a-49d5-8bf4-306593bcb686\") " Mar 11 09:52:03 crc kubenswrapper[4830]: I0311 09:52:03.745235 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d876e5-5e9a-49d5-8bf4-306593bcb686-kube-api-access-ctfmh" (OuterVolumeSpecName: "kube-api-access-ctfmh") pod "37d876e5-5e9a-49d5-8bf4-306593bcb686" (UID: "37d876e5-5e9a-49d5-8bf4-306593bcb686"). InnerVolumeSpecName "kube-api-access-ctfmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:52:03 crc kubenswrapper[4830]: I0311 09:52:03.841470 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctfmh\" (UniqueName: \"kubernetes.io/projected/37d876e5-5e9a-49d5-8bf4-306593bcb686-kube-api-access-ctfmh\") on node \"crc\" DevicePath \"\"" Mar 11 09:52:04 crc kubenswrapper[4830]: I0311 09:52:04.318979 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553712-5j9rz" event={"ID":"37d876e5-5e9a-49d5-8bf4-306593bcb686","Type":"ContainerDied","Data":"02bf652b0beb8e5e5e92e279be3a5fa96968113a507ea94a64794ed53df97a8b"} Mar 11 09:52:04 crc kubenswrapper[4830]: I0311 09:52:04.319051 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02bf652b0beb8e5e5e92e279be3a5fa96968113a507ea94a64794ed53df97a8b" Mar 11 09:52:04 crc kubenswrapper[4830]: I0311 09:52:04.319051 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553712-5j9rz" Mar 11 09:52:04 crc kubenswrapper[4830]: I0311 09:52:04.670271 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553706-5cq7l"] Mar 11 09:52:04 crc kubenswrapper[4830]: I0311 09:52:04.677957 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553706-5cq7l"] Mar 11 09:52:04 crc kubenswrapper[4830]: I0311 09:52:04.944270 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb31f53a-31eb-4cfc-90c4-f3a508e746b3" path="/var/lib/kubelet/pods/bb31f53a-31eb-4cfc-90c4-f3a508e746b3/volumes" Mar 11 09:52:08 crc kubenswrapper[4830]: I0311 09:52:08.933666 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:52:08 crc kubenswrapper[4830]: E0311 09:52:08.934187 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:52:12 crc kubenswrapper[4830]: I0311 09:52:12.429917 4830 scope.go:117] "RemoveContainer" containerID="b7eff5f823898959b976ca74d8e087b7766236f5627c5de44f119c5f40f9317e" Mar 11 09:52:22 crc kubenswrapper[4830]: I0311 09:52:22.938440 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:52:22 crc kubenswrapper[4830]: E0311 09:52:22.939318 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:52:33 crc kubenswrapper[4830]: I0311 09:52:33.932474 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:52:33 crc kubenswrapper[4830]: E0311 09:52:33.933232 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:52:44 crc kubenswrapper[4830]: I0311 09:52:44.933290 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:52:44 crc kubenswrapper[4830]: E0311 09:52:44.934185 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:52:58 crc kubenswrapper[4830]: I0311 09:52:58.933319 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:52:58 crc kubenswrapper[4830]: E0311 09:52:58.934239 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:53:09 crc kubenswrapper[4830]: I0311 09:53:09.934255 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:53:09 crc kubenswrapper[4830]: E0311 09:53:09.934936 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:53:23 crc kubenswrapper[4830]: I0311 09:53:23.932742 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:53:23 crc kubenswrapper[4830]: E0311 09:53:23.934836 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:53:38 crc kubenswrapper[4830]: I0311 09:53:38.933516 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:53:38 crc kubenswrapper[4830]: E0311 09:53:38.934337 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:53:42 crc kubenswrapper[4830]: I0311 09:53:42.165133 4830 generic.go:334] "Generic (PLEG): container finished" podID="bfdc6f64-813a-4a57-a123-b4d15c6ae569" containerID="a1be9b6da457602daaf573824ff52181445e2c1c96862ad30320eedb179fdc8c" exitCode=0 Mar 11 09:53:42 crc kubenswrapper[4830]: I0311 09:53:42.165220 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" event={"ID":"bfdc6f64-813a-4a57-a123-b4d15c6ae569","Type":"ContainerDied","Data":"a1be9b6da457602daaf573824ff52181445e2c1c96862ad30320eedb179fdc8c"} Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.561605 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.686399 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-secret-0\") pod \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.686767 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-combined-ca-bundle\") pod \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.686877 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-ssh-key-openstack-edpm-ipam\") pod \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.686912 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6xb2\" (UniqueName: \"kubernetes.io/projected/bfdc6f64-813a-4a57-a123-b4d15c6ae569-kube-api-access-h6xb2\") pod \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.687063 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-inventory\") pod \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\" (UID: \"bfdc6f64-813a-4a57-a123-b4d15c6ae569\") " Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.693053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "bfdc6f64-813a-4a57-a123-b4d15c6ae569" (UID: "bfdc6f64-813a-4a57-a123-b4d15c6ae569"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.697005 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfdc6f64-813a-4a57-a123-b4d15c6ae569-kube-api-access-h6xb2" (OuterVolumeSpecName: "kube-api-access-h6xb2") pod "bfdc6f64-813a-4a57-a123-b4d15c6ae569" (UID: "bfdc6f64-813a-4a57-a123-b4d15c6ae569"). InnerVolumeSpecName "kube-api-access-h6xb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.717154 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "bfdc6f64-813a-4a57-a123-b4d15c6ae569" (UID: "bfdc6f64-813a-4a57-a123-b4d15c6ae569"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.717478 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bfdc6f64-813a-4a57-a123-b4d15c6ae569" (UID: "bfdc6f64-813a-4a57-a123-b4d15c6ae569"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.719299 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-inventory" (OuterVolumeSpecName: "inventory") pod "bfdc6f64-813a-4a57-a123-b4d15c6ae569" (UID: "bfdc6f64-813a-4a57-a123-b4d15c6ae569"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.789057 4830 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.789319 4830 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.789409 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.789484 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6xb2\" (UniqueName: \"kubernetes.io/projected/bfdc6f64-813a-4a57-a123-b4d15c6ae569-kube-api-access-h6xb2\") on node \"crc\" DevicePath \"\"" Mar 11 09:53:43 crc kubenswrapper[4830]: I0311 09:53:43.789548 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfdc6f64-813a-4a57-a123-b4d15c6ae569-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.185444 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" event={"ID":"bfdc6f64-813a-4a57-a123-b4d15c6ae569","Type":"ContainerDied","Data":"722a778007f9721838f096e8465969ea65d8fbffa18a344ca2b74e62fbba0d68"} Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.185496 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="722a778007f9721838f096e8465969ea65d8fbffa18a344ca2b74e62fbba0d68" Mar 11 
09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.185562 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.296871 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm"] Mar 11 09:53:44 crc kubenswrapper[4830]: E0311 09:53:44.297346 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d876e5-5e9a-49d5-8bf4-306593bcb686" containerName="oc" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.297364 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d876e5-5e9a-49d5-8bf4-306593bcb686" containerName="oc" Mar 11 09:53:44 crc kubenswrapper[4830]: E0311 09:53:44.297384 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdc6f64-813a-4a57-a123-b4d15c6ae569" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.297391 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdc6f64-813a-4a57-a123-b4d15c6ae569" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.297564 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfdc6f64-813a-4a57-a123-b4d15c6ae569" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.297577 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d876e5-5e9a-49d5-8bf4-306593bcb686" containerName="oc" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.298334 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.302623 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.306451 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm"] Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.308234 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.308539 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.308561 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.308773 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.309174 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.312606 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:53:44 crc kubenswrapper[4830]: E0311 09:53:44.378300 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfdc6f64_813a_4a57_a123_b4d15c6ae569.slice/crio-722a778007f9721838f096e8465969ea65d8fbffa18a344ca2b74e62fbba0d68\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfdc6f64_813a_4a57_a123_b4d15c6ae569.slice\": RecentStats: unable to find data in memory cache]" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.401579 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.401958 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5gwd\" (UniqueName: \"kubernetes.io/projected/df44fa5f-956c-47f8-af60-49a95e1c6da1-kube-api-access-n5gwd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.402117 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.402222 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 
crc kubenswrapper[4830]: I0311 09:53:44.402324 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.402479 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.402620 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.402676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.402809 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.402900 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.403076 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505210 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5gwd\" (UniqueName: \"kubernetes.io/projected/df44fa5f-956c-47f8-af60-49a95e1c6da1-kube-api-access-n5gwd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505558 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-3\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505585 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505611 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505667 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505689 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505713 
4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505779 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505827 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505867 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.505957 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.507241 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.509400 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.510415 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.510719 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.510760 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.510952 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.511179 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.512794 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.519473 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: 
\"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.521330 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.525819 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5gwd\" (UniqueName: \"kubernetes.io/projected/df44fa5f-956c-47f8-af60-49a95e1c6da1-kube-api-access-n5gwd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-79mrm\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:44 crc kubenswrapper[4830]: I0311 09:53:44.621099 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:53:45 crc kubenswrapper[4830]: I0311 09:53:45.160485 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm"] Mar 11 09:53:45 crc kubenswrapper[4830]: I0311 09:53:45.198519 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" event={"ID":"df44fa5f-956c-47f8-af60-49a95e1c6da1","Type":"ContainerStarted","Data":"4fd8b314bd9604f8fd07e1989c41f1f247d0f8847843acf8af7d22ee3b429be1"} Mar 11 09:53:46 crc kubenswrapper[4830]: I0311 09:53:46.209867 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" event={"ID":"df44fa5f-956c-47f8-af60-49a95e1c6da1","Type":"ContainerStarted","Data":"c553316dff92f7b67da6bed0b2418d74941101ecfd7f0babf17683d36846be94"} Mar 11 09:53:46 crc kubenswrapper[4830]: I0311 09:53:46.248647 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" podStartSLOduration=1.71438309 podStartE2EDuration="2.248629549s" podCreationTimestamp="2026-03-11 09:53:44 +0000 UTC" firstStartedPulling="2026-03-11 09:53:45.170406646 +0000 UTC m=+2392.951557335" lastFinishedPulling="2026-03-11 09:53:45.704653105 +0000 UTC m=+2393.485803794" observedRunningTime="2026-03-11 09:53:46.242601965 +0000 UTC m=+2394.023752674" watchObservedRunningTime="2026-03-11 09:53:46.248629549 +0000 UTC m=+2394.029780228" Mar 11 09:53:50 crc kubenswrapper[4830]: I0311 09:53:50.934394 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:53:50 crc kubenswrapper[4830]: E0311 09:53:50.935073 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.152868 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553714-zbghn"] Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.155782 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553714-zbghn" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.160011 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.160092 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.163920 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.168278 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553714-zbghn"] Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.251797 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnhsw\" (UniqueName: \"kubernetes.io/projected/338ffab6-8c72-40aa-b2b6-582c35164f2a-kube-api-access-jnhsw\") pod \"auto-csr-approver-29553714-zbghn\" (UID: \"338ffab6-8c72-40aa-b2b6-582c35164f2a\") " pod="openshift-infra/auto-csr-approver-29553714-zbghn" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.354283 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnhsw\" (UniqueName: 
\"kubernetes.io/projected/338ffab6-8c72-40aa-b2b6-582c35164f2a-kube-api-access-jnhsw\") pod \"auto-csr-approver-29553714-zbghn\" (UID: \"338ffab6-8c72-40aa-b2b6-582c35164f2a\") " pod="openshift-infra/auto-csr-approver-29553714-zbghn" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.374667 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnhsw\" (UniqueName: \"kubernetes.io/projected/338ffab6-8c72-40aa-b2b6-582c35164f2a-kube-api-access-jnhsw\") pod \"auto-csr-approver-29553714-zbghn\" (UID: \"338ffab6-8c72-40aa-b2b6-582c35164f2a\") " pod="openshift-infra/auto-csr-approver-29553714-zbghn" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.482524 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553714-zbghn" Mar 11 09:54:00 crc kubenswrapper[4830]: I0311 09:54:00.957346 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553714-zbghn"] Mar 11 09:54:01 crc kubenswrapper[4830]: I0311 09:54:01.385801 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553714-zbghn" event={"ID":"338ffab6-8c72-40aa-b2b6-582c35164f2a","Type":"ContainerStarted","Data":"1299b1e525dceecfbbcef66b347daafc7145159d5a9c2ac675e043ae544d86e5"} Mar 11 09:54:01 crc kubenswrapper[4830]: I0311 09:54:01.934252 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:54:01 crc kubenswrapper[4830]: E0311 09:54:01.935203 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" 
podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:54:03 crc kubenswrapper[4830]: I0311 09:54:03.403670 4830 generic.go:334] "Generic (PLEG): container finished" podID="338ffab6-8c72-40aa-b2b6-582c35164f2a" containerID="92720a5af4102d3f119c4a6e93e10dfc968c09a8724c3beba6d5ddd3c34046d5" exitCode=0 Mar 11 09:54:03 crc kubenswrapper[4830]: I0311 09:54:03.403729 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553714-zbghn" event={"ID":"338ffab6-8c72-40aa-b2b6-582c35164f2a","Type":"ContainerDied","Data":"92720a5af4102d3f119c4a6e93e10dfc968c09a8724c3beba6d5ddd3c34046d5"} Mar 11 09:54:04 crc kubenswrapper[4830]: I0311 09:54:04.755627 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553714-zbghn" Mar 11 09:54:04 crc kubenswrapper[4830]: I0311 09:54:04.853673 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnhsw\" (UniqueName: \"kubernetes.io/projected/338ffab6-8c72-40aa-b2b6-582c35164f2a-kube-api-access-jnhsw\") pod \"338ffab6-8c72-40aa-b2b6-582c35164f2a\" (UID: \"338ffab6-8c72-40aa-b2b6-582c35164f2a\") " Mar 11 09:54:04 crc kubenswrapper[4830]: I0311 09:54:04.861401 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/338ffab6-8c72-40aa-b2b6-582c35164f2a-kube-api-access-jnhsw" (OuterVolumeSpecName: "kube-api-access-jnhsw") pod "338ffab6-8c72-40aa-b2b6-582c35164f2a" (UID: "338ffab6-8c72-40aa-b2b6-582c35164f2a"). InnerVolumeSpecName "kube-api-access-jnhsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:54:04 crc kubenswrapper[4830]: I0311 09:54:04.956003 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnhsw\" (UniqueName: \"kubernetes.io/projected/338ffab6-8c72-40aa-b2b6-582c35164f2a-kube-api-access-jnhsw\") on node \"crc\" DevicePath \"\"" Mar 11 09:54:05 crc kubenswrapper[4830]: I0311 09:54:05.420561 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553714-zbghn" Mar 11 09:54:05 crc kubenswrapper[4830]: I0311 09:54:05.420551 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553714-zbghn" event={"ID":"338ffab6-8c72-40aa-b2b6-582c35164f2a","Type":"ContainerDied","Data":"1299b1e525dceecfbbcef66b347daafc7145159d5a9c2ac675e043ae544d86e5"} Mar 11 09:54:05 crc kubenswrapper[4830]: I0311 09:54:05.420964 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1299b1e525dceecfbbcef66b347daafc7145159d5a9c2ac675e043ae544d86e5" Mar 11 09:54:05 crc kubenswrapper[4830]: I0311 09:54:05.836314 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553708-ctxh8"] Mar 11 09:54:05 crc kubenswrapper[4830]: I0311 09:54:05.848191 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553708-ctxh8"] Mar 11 09:54:06 crc kubenswrapper[4830]: I0311 09:54:06.943282 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b64cb553-4d78-4d92-be2d-191073aaa5e5" path="/var/lib/kubelet/pods/b64cb553-4d78-4d92-be2d-191073aaa5e5/volumes" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.021242 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6qrdh"] Mar 11 09:54:10 crc kubenswrapper[4830]: E0311 09:54:10.022223 4830 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="338ffab6-8c72-40aa-b2b6-582c35164f2a" containerName="oc" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.022239 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="338ffab6-8c72-40aa-b2b6-582c35164f2a" containerName="oc" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.022487 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="338ffab6-8c72-40aa-b2b6-582c35164f2a" containerName="oc" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.032823 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.058094 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qrdh"] Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.163440 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-utilities\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.163514 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-catalog-content\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.163550 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxks\" (UniqueName: \"kubernetes.io/projected/b7b791f3-4c95-42e3-802d-7c12b1e20f74-kube-api-access-9sxks\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " 
pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.266664 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-utilities\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.266773 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-catalog-content\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.266830 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxks\" (UniqueName: \"kubernetes.io/projected/b7b791f3-4c95-42e3-802d-7c12b1e20f74-kube-api-access-9sxks\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.267419 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-utilities\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.267508 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-catalog-content\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " 
pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.287957 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxks\" (UniqueName: \"kubernetes.io/projected/b7b791f3-4c95-42e3-802d-7c12b1e20f74-kube-api-access-9sxks\") pod \"community-operators-6qrdh\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.355277 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:10 crc kubenswrapper[4830]: I0311 09:54:10.858732 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qrdh"] Mar 11 09:54:11 crc kubenswrapper[4830]: I0311 09:54:11.479883 4830 generic.go:334] "Generic (PLEG): container finished" podID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerID="d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4" exitCode=0 Mar 11 09:54:11 crc kubenswrapper[4830]: I0311 09:54:11.479971 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qrdh" event={"ID":"b7b791f3-4c95-42e3-802d-7c12b1e20f74","Type":"ContainerDied","Data":"d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4"} Mar 11 09:54:11 crc kubenswrapper[4830]: I0311 09:54:11.480292 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qrdh" event={"ID":"b7b791f3-4c95-42e3-802d-7c12b1e20f74","Type":"ContainerStarted","Data":"502a438f762d7d4d01ee92ff64d9ed8fc17480c24d1c3f43d7811fbcb52e3cab"} Mar 11 09:54:12 crc kubenswrapper[4830]: I0311 09:54:12.512266 4830 scope.go:117] "RemoveContainer" containerID="90e65b1c2d89a74fcafd8f1e9429f4aa37f35143da58e57bcbd7008ef15b61f0" Mar 11 09:54:13 crc kubenswrapper[4830]: I0311 09:54:13.552703 4830 generic.go:334] "Generic 
(PLEG): container finished" podID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerID="be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8" exitCode=0 Mar 11 09:54:13 crc kubenswrapper[4830]: I0311 09:54:13.552794 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qrdh" event={"ID":"b7b791f3-4c95-42e3-802d-7c12b1e20f74","Type":"ContainerDied","Data":"be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8"} Mar 11 09:54:13 crc kubenswrapper[4830]: I0311 09:54:13.932640 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:54:13 crc kubenswrapper[4830]: E0311 09:54:13.933050 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:54:15 crc kubenswrapper[4830]: I0311 09:54:15.571743 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qrdh" event={"ID":"b7b791f3-4c95-42e3-802d-7c12b1e20f74","Type":"ContainerStarted","Data":"6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043"} Mar 11 09:54:15 crc kubenswrapper[4830]: I0311 09:54:15.599471 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6qrdh" podStartSLOduration=2.304509424 podStartE2EDuration="5.599455958s" podCreationTimestamp="2026-03-11 09:54:10 +0000 UTC" firstStartedPulling="2026-03-11 09:54:11.481417843 +0000 UTC m=+2419.262568532" lastFinishedPulling="2026-03-11 09:54:14.776364357 +0000 UTC m=+2422.557515066" observedRunningTime="2026-03-11 
09:54:15.597452814 +0000 UTC m=+2423.378603513" watchObservedRunningTime="2026-03-11 09:54:15.599455958 +0000 UTC m=+2423.380606647" Mar 11 09:54:20 crc kubenswrapper[4830]: I0311 09:54:20.356220 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:20 crc kubenswrapper[4830]: I0311 09:54:20.356664 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:20 crc kubenswrapper[4830]: I0311 09:54:20.405756 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:20 crc kubenswrapper[4830]: I0311 09:54:20.681653 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:22 crc kubenswrapper[4830]: I0311 09:54:22.411925 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6qrdh"] Mar 11 09:54:22 crc kubenswrapper[4830]: I0311 09:54:22.651403 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6qrdh" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerName="registry-server" containerID="cri-o://6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043" gracePeriod=2 Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.093726 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.135164 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-utilities\") pod \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.135518 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-catalog-content\") pod \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.135668 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxks\" (UniqueName: \"kubernetes.io/projected/b7b791f3-4c95-42e3-802d-7c12b1e20f74-kube-api-access-9sxks\") pod \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\" (UID: \"b7b791f3-4c95-42e3-802d-7c12b1e20f74\") " Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.135922 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-utilities" (OuterVolumeSpecName: "utilities") pod "b7b791f3-4c95-42e3-802d-7c12b1e20f74" (UID: "b7b791f3-4c95-42e3-802d-7c12b1e20f74"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.136196 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.142229 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b791f3-4c95-42e3-802d-7c12b1e20f74-kube-api-access-9sxks" (OuterVolumeSpecName: "kube-api-access-9sxks") pod "b7b791f3-4c95-42e3-802d-7c12b1e20f74" (UID: "b7b791f3-4c95-42e3-802d-7c12b1e20f74"). InnerVolumeSpecName "kube-api-access-9sxks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.200133 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7b791f3-4c95-42e3-802d-7c12b1e20f74" (UID: "b7b791f3-4c95-42e3-802d-7c12b1e20f74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.237465 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7b791f3-4c95-42e3-802d-7c12b1e20f74-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.237508 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sxks\" (UniqueName: \"kubernetes.io/projected/b7b791f3-4c95-42e3-802d-7c12b1e20f74-kube-api-access-9sxks\") on node \"crc\" DevicePath \"\"" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.660656 4830 generic.go:334] "Generic (PLEG): container finished" podID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerID="6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043" exitCode=0 Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.660695 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qrdh" event={"ID":"b7b791f3-4c95-42e3-802d-7c12b1e20f74","Type":"ContainerDied","Data":"6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043"} Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.660748 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6qrdh" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.660770 4830 scope.go:117] "RemoveContainer" containerID="6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.660755 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qrdh" event={"ID":"b7b791f3-4c95-42e3-802d-7c12b1e20f74","Type":"ContainerDied","Data":"502a438f762d7d4d01ee92ff64d9ed8fc17480c24d1c3f43d7811fbcb52e3cab"} Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.683288 4830 scope.go:117] "RemoveContainer" containerID="be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.696212 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6qrdh"] Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.703226 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6qrdh"] Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.718840 4830 scope.go:117] "RemoveContainer" containerID="d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.754165 4830 scope.go:117] "RemoveContainer" containerID="6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043" Mar 11 09:54:23 crc kubenswrapper[4830]: E0311 09:54:23.754619 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043\": container with ID starting with 6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043 not found: ID does not exist" containerID="6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.754672 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043"} err="failed to get container status \"6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043\": rpc error: code = NotFound desc = could not find container \"6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043\": container with ID starting with 6a5ebe61c2279159a3af545e6bd99b8c640c3c632fd6c301f64cb404f2f1d043 not found: ID does not exist" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.754700 4830 scope.go:117] "RemoveContainer" containerID="be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8" Mar 11 09:54:23 crc kubenswrapper[4830]: E0311 09:54:23.755201 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8\": container with ID starting with be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8 not found: ID does not exist" containerID="be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.755260 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8"} err="failed to get container status \"be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8\": rpc error: code = NotFound desc = could not find container \"be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8\": container with ID starting with be96993c615ec24cf1039c239be3873df902aff12997a7ec99f76ae7014521b8 not found: ID does not exist" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.755294 4830 scope.go:117] "RemoveContainer" containerID="d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4" Mar 11 09:54:23 crc kubenswrapper[4830]: E0311 
09:54:23.755771 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4\": container with ID starting with d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4 not found: ID does not exist" containerID="d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4" Mar 11 09:54:23 crc kubenswrapper[4830]: I0311 09:54:23.755803 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4"} err="failed to get container status \"d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4\": rpc error: code = NotFound desc = could not find container \"d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4\": container with ID starting with d42d31f2292082b3e0774c41de24de042b0c2accbb16e5671d3de6eb7358cbe4 not found: ID does not exist" Mar 11 09:54:24 crc kubenswrapper[4830]: I0311 09:54:24.933004 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:54:24 crc kubenswrapper[4830]: E0311 09:54:24.934030 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:54:24 crc kubenswrapper[4830]: I0311 09:54:24.943772 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" path="/var/lib/kubelet/pods/b7b791f3-4c95-42e3-802d-7c12b1e20f74/volumes" Mar 11 09:54:38 crc kubenswrapper[4830]: I0311 09:54:38.935824 
4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:54:38 crc kubenswrapper[4830]: E0311 09:54:38.936653 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:54:49 crc kubenswrapper[4830]: I0311 09:54:49.933421 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:54:49 crc kubenswrapper[4830]: E0311 09:54:49.934222 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:55:03 crc kubenswrapper[4830]: I0311 09:55:03.932531 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:55:03 crc kubenswrapper[4830]: E0311 09:55:03.933431 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:55:18 crc kubenswrapper[4830]: I0311 
09:55:18.932417 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:55:18 crc kubenswrapper[4830]: E0311 09:55:18.933463 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:55:31 crc kubenswrapper[4830]: I0311 09:55:31.933186 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:55:31 crc kubenswrapper[4830]: E0311 09:55:31.934492 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:55:46 crc kubenswrapper[4830]: I0311 09:55:46.932726 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:55:46 crc kubenswrapper[4830]: E0311 09:55:46.933622 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:55:58 crc 
kubenswrapper[4830]: I0311 09:55:58.933145 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:55:58 crc kubenswrapper[4830]: E0311 09:55:58.934039 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.147312 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553716-95pb2"] Mar 11 09:56:00 crc kubenswrapper[4830]: E0311 09:56:00.148086 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerName="extract-utilities" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.148108 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerName="extract-utilities" Mar 11 09:56:00 crc kubenswrapper[4830]: E0311 09:56:00.148128 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerName="extract-content" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.148136 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerName="extract-content" Mar 11 09:56:00 crc kubenswrapper[4830]: E0311 09:56:00.148156 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerName="registry-server" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.148166 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" 
containerName="registry-server" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.148415 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b791f3-4c95-42e3-802d-7c12b1e20f74" containerName="registry-server" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.149257 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553716-95pb2" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.151452 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.151668 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.151687 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.169685 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553716-95pb2"] Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.198662 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzpld\" (UniqueName: \"kubernetes.io/projected/5844a434-4c88-45cb-83a1-8c82552f6fe0-kube-api-access-xzpld\") pod \"auto-csr-approver-29553716-95pb2\" (UID: \"5844a434-4c88-45cb-83a1-8c82552f6fe0\") " pod="openshift-infra/auto-csr-approver-29553716-95pb2" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.301151 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzpld\" (UniqueName: \"kubernetes.io/projected/5844a434-4c88-45cb-83a1-8c82552f6fe0-kube-api-access-xzpld\") pod \"auto-csr-approver-29553716-95pb2\" (UID: \"5844a434-4c88-45cb-83a1-8c82552f6fe0\") " pod="openshift-infra/auto-csr-approver-29553716-95pb2" Mar 11 
09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.331795 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzpld\" (UniqueName: \"kubernetes.io/projected/5844a434-4c88-45cb-83a1-8c82552f6fe0-kube-api-access-xzpld\") pod \"auto-csr-approver-29553716-95pb2\" (UID: \"5844a434-4c88-45cb-83a1-8c82552f6fe0\") " pod="openshift-infra/auto-csr-approver-29553716-95pb2" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.467235 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553716-95pb2" Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.922763 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 09:56:00 crc kubenswrapper[4830]: I0311 09:56:00.925407 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553716-95pb2"] Mar 11 09:56:01 crc kubenswrapper[4830]: I0311 09:56:01.475564 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553716-95pb2" event={"ID":"5844a434-4c88-45cb-83a1-8c82552f6fe0","Type":"ContainerStarted","Data":"7111169a3f6c352038cad06c7a82d9ce27e09f9acbd6003143cae3f481d13d68"} Mar 11 09:56:03 crc kubenswrapper[4830]: I0311 09:56:03.531428 4830 generic.go:334] "Generic (PLEG): container finished" podID="5844a434-4c88-45cb-83a1-8c82552f6fe0" containerID="d6da0d0af3409c3fe80c4cef47396543558c833dc514a779d911e188e53f83a2" exitCode=0 Mar 11 09:56:03 crc kubenswrapper[4830]: I0311 09:56:03.531489 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553716-95pb2" event={"ID":"5844a434-4c88-45cb-83a1-8c82552f6fe0","Type":"ContainerDied","Data":"d6da0d0af3409c3fe80c4cef47396543558c833dc514a779d911e188e53f83a2"} Mar 11 09:56:04 crc kubenswrapper[4830]: I0311 09:56:04.548239 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="df44fa5f-956c-47f8-af60-49a95e1c6da1" containerID="c553316dff92f7b67da6bed0b2418d74941101ecfd7f0babf17683d36846be94" exitCode=0 Mar 11 09:56:04 crc kubenswrapper[4830]: I0311 09:56:04.548348 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" event={"ID":"df44fa5f-956c-47f8-af60-49a95e1c6da1","Type":"ContainerDied","Data":"c553316dff92f7b67da6bed0b2418d74941101ecfd7f0babf17683d36846be94"} Mar 11 09:56:04 crc kubenswrapper[4830]: I0311 09:56:04.900846 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553716-95pb2" Mar 11 09:56:05 crc kubenswrapper[4830]: I0311 09:56:05.093079 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzpld\" (UniqueName: \"kubernetes.io/projected/5844a434-4c88-45cb-83a1-8c82552f6fe0-kube-api-access-xzpld\") pod \"5844a434-4c88-45cb-83a1-8c82552f6fe0\" (UID: \"5844a434-4c88-45cb-83a1-8c82552f6fe0\") " Mar 11 09:56:05 crc kubenswrapper[4830]: I0311 09:56:05.099312 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5844a434-4c88-45cb-83a1-8c82552f6fe0-kube-api-access-xzpld" (OuterVolumeSpecName: "kube-api-access-xzpld") pod "5844a434-4c88-45cb-83a1-8c82552f6fe0" (UID: "5844a434-4c88-45cb-83a1-8c82552f6fe0"). InnerVolumeSpecName "kube-api-access-xzpld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:56:05 crc kubenswrapper[4830]: I0311 09:56:05.196659 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzpld\" (UniqueName: \"kubernetes.io/projected/5844a434-4c88-45cb-83a1-8c82552f6fe0-kube-api-access-xzpld\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:05 crc kubenswrapper[4830]: I0311 09:56:05.560855 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553716-95pb2" event={"ID":"5844a434-4c88-45cb-83a1-8c82552f6fe0","Type":"ContainerDied","Data":"7111169a3f6c352038cad06c7a82d9ce27e09f9acbd6003143cae3f481d13d68"} Mar 11 09:56:05 crc kubenswrapper[4830]: I0311 09:56:05.560917 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7111169a3f6c352038cad06c7a82d9ce27e09f9acbd6003143cae3f481d13d68" Mar 11 09:56:05 crc kubenswrapper[4830]: I0311 09:56:05.563896 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553716-95pb2" Mar 11 09:56:05 crc kubenswrapper[4830]: I0311 09:56:05.984328 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553710-wzsnn"] Mar 11 09:56:05 crc kubenswrapper[4830]: I0311 09:56:05.998731 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553710-wzsnn"] Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.020729 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118165 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-extra-config-0\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118235 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5gwd\" (UniqueName: \"kubernetes.io/projected/df44fa5f-956c-47f8-af60-49a95e1c6da1-kube-api-access-n5gwd\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118269 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-3\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118321 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-ssh-key-openstack-edpm-ipam\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118346 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-1\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 
09:56:06.118377 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-0\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118419 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-combined-ca-bundle\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118479 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-inventory\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118503 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-2\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118543 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-1\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.118618 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-0\") pod \"df44fa5f-956c-47f8-af60-49a95e1c6da1\" (UID: \"df44fa5f-956c-47f8-af60-49a95e1c6da1\") " Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.124262 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.127377 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df44fa5f-956c-47f8-af60-49a95e1c6da1-kube-api-access-n5gwd" (OuterVolumeSpecName: "kube-api-access-n5gwd") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "kube-api-access-n5gwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.152044 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.157060 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.157436 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.160502 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.161992 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.163275 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-inventory" (OuterVolumeSpecName: "inventory") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.165906 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.173377 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.185322 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "df44fa5f-956c-47f8-af60-49a95e1c6da1" (UID: "df44fa5f-956c-47f8-af60-49a95e1c6da1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221752 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221832 4830 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221848 4830 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221862 4830 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221878 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221893 4830 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221908 4830 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221922 4830 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221935 4830 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221948 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5gwd\" (UniqueName: \"kubernetes.io/projected/df44fa5f-956c-47f8-af60-49a95e1c6da1-kube-api-access-n5gwd\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.221961 4830 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/df44fa5f-956c-47f8-af60-49a95e1c6da1-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.573327 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" event={"ID":"df44fa5f-956c-47f8-af60-49a95e1c6da1","Type":"ContainerDied","Data":"4fd8b314bd9604f8fd07e1989c41f1f247d0f8847843acf8af7d22ee3b429be1"} Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.573384 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fd8b314bd9604f8fd07e1989c41f1f247d0f8847843acf8af7d22ee3b429be1" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.573386 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-79mrm" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.666171 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4"] Mar 11 09:56:06 crc kubenswrapper[4830]: E0311 09:56:06.666834 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df44fa5f-956c-47f8-af60-49a95e1c6da1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.666850 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="df44fa5f-956c-47f8-af60-49a95e1c6da1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 11 09:56:06 crc kubenswrapper[4830]: E0311 09:56:06.666905 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5844a434-4c88-45cb-83a1-8c82552f6fe0" containerName="oc" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.666911 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="5844a434-4c88-45cb-83a1-8c82552f6fe0" containerName="oc" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.667093 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="df44fa5f-956c-47f8-af60-49a95e1c6da1" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.667131 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="5844a434-4c88-45cb-83a1-8c82552f6fe0" containerName="oc" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.667741 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.671831 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.671864 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.672087 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-6rcc4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.672139 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.672336 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.679955 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4"] Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.731074 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.731141 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.731204 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.731252 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.731334 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.731383 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.731403 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdhf2\" (UniqueName: \"kubernetes.io/projected/d6daac1f-f36f-42a1-9735-1b182e03052e-kube-api-access-wdhf2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.832723 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.832813 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.832895 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.832952 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.832976 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdhf2\" (UniqueName: \"kubernetes.io/projected/d6daac1f-f36f-42a1-9735-1b182e03052e-kube-api-access-wdhf2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.833053 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.833106 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.837731 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.837740 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.838049 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.838236 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.838510 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.839364 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.848702 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdhf2\" (UniqueName: \"kubernetes.io/projected/d6daac1f-f36f-42a1-9735-1b182e03052e-kube-api-access-wdhf2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.945399 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777ba606-ef62-40e8-8c74-59538066c64f" path="/var/lib/kubelet/pods/777ba606-ef62-40e8-8c74-59538066c64f/volumes" Mar 11 09:56:06 crc kubenswrapper[4830]: I0311 09:56:06.984946 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:56:07 crc kubenswrapper[4830]: I0311 09:56:07.489974 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4"] Mar 11 09:56:07 crc kubenswrapper[4830]: I0311 09:56:07.583271 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" event={"ID":"d6daac1f-f36f-42a1-9735-1b182e03052e","Type":"ContainerStarted","Data":"5bc2b8b02f355fdf5585b9a10227889f11b9651892c9f1e08dcd5178878f33dc"} Mar 11 09:56:08 crc kubenswrapper[4830]: I0311 09:56:08.591792 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" event={"ID":"d6daac1f-f36f-42a1-9735-1b182e03052e","Type":"ContainerStarted","Data":"db4abbbc669c51a9b5c4779587f786ca4b8b6711a67608608e22662e6d06f752"} Mar 11 09:56:09 crc kubenswrapper[4830]: I0311 09:56:09.933302 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:56:09 crc kubenswrapper[4830]: E0311 09:56:09.934300 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:56:12 crc kubenswrapper[4830]: I0311 09:56:12.624373 4830 scope.go:117] "RemoveContainer" containerID="3bd92cd6e9689468af8f75daedb522e0361f0613a25b26cbe5a0c8b3afb6223d" Mar 11 09:56:23 crc kubenswrapper[4830]: I0311 09:56:23.933108 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 
09:56:23 crc kubenswrapper[4830]: E0311 09:56:23.934057 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:56:38 crc kubenswrapper[4830]: I0311 09:56:38.934091 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:56:38 crc kubenswrapper[4830]: E0311 09:56:38.934864 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 09:56:51 crc kubenswrapper[4830]: I0311 09:56:51.932914 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 09:56:53 crc kubenswrapper[4830]: I0311 09:56:53.017516 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"65b35d2c5b201abecd063ae3cba9aa95e395340f3759873b180ee4836ac11e36"} Mar 11 09:56:53 crc kubenswrapper[4830]: I0311 09:56:53.059086 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" podStartSLOduration=46.54196141 podStartE2EDuration="47.059044775s" podCreationTimestamp="2026-03-11 09:56:06 +0000 UTC" 
firstStartedPulling="2026-03-11 09:56:07.495092775 +0000 UTC m=+2535.276243464" lastFinishedPulling="2026-03-11 09:56:08.01217614 +0000 UTC m=+2535.793326829" observedRunningTime="2026-03-11 09:56:08.609808299 +0000 UTC m=+2536.390959008" watchObservedRunningTime="2026-03-11 09:56:53.059044775 +0000 UTC m=+2580.840195474" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.145158 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553718-5wskx"] Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.147230 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553718-5wskx" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.151821 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.151801 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.152216 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.158183 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553718-5wskx"] Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.215858 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsncr\" (UniqueName: \"kubernetes.io/projected/2e97f930-5cd6-46d9-a168-f1a3b4f98936-kube-api-access-nsncr\") pod \"auto-csr-approver-29553718-5wskx\" (UID: \"2e97f930-5cd6-46d9-a168-f1a3b4f98936\") " pod="openshift-infra/auto-csr-approver-29553718-5wskx" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.317124 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsncr\" 
(UniqueName: \"kubernetes.io/projected/2e97f930-5cd6-46d9-a168-f1a3b4f98936-kube-api-access-nsncr\") pod \"auto-csr-approver-29553718-5wskx\" (UID: \"2e97f930-5cd6-46d9-a168-f1a3b4f98936\") " pod="openshift-infra/auto-csr-approver-29553718-5wskx" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.337975 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsncr\" (UniqueName: \"kubernetes.io/projected/2e97f930-5cd6-46d9-a168-f1a3b4f98936-kube-api-access-nsncr\") pod \"auto-csr-approver-29553718-5wskx\" (UID: \"2e97f930-5cd6-46d9-a168-f1a3b4f98936\") " pod="openshift-infra/auto-csr-approver-29553718-5wskx" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.467667 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553718-5wskx" Mar 11 09:58:00 crc kubenswrapper[4830]: I0311 09:58:00.905922 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553718-5wskx"] Mar 11 09:58:01 crc kubenswrapper[4830]: I0311 09:58:01.594830 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553718-5wskx" event={"ID":"2e97f930-5cd6-46d9-a168-f1a3b4f98936","Type":"ContainerStarted","Data":"173633b7700cab10d0e1e0b2d8928d6d8c17545590895a25d9637bf0fc817e55"} Mar 11 09:58:02 crc kubenswrapper[4830]: I0311 09:58:02.606351 4830 generic.go:334] "Generic (PLEG): container finished" podID="2e97f930-5cd6-46d9-a168-f1a3b4f98936" containerID="90ca592dd5ee7483487dfd489dcde7f0a8aa3ef1a2f95a8b93402fc5f3fb792c" exitCode=0 Mar 11 09:58:02 crc kubenswrapper[4830]: I0311 09:58:02.606409 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553718-5wskx" event={"ID":"2e97f930-5cd6-46d9-a168-f1a3b4f98936","Type":"ContainerDied","Data":"90ca592dd5ee7483487dfd489dcde7f0a8aa3ef1a2f95a8b93402fc5f3fb792c"} Mar 11 09:58:03 crc kubenswrapper[4830]: I0311 09:58:03.967215 4830 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553718-5wskx" Mar 11 09:58:03 crc kubenswrapper[4830]: I0311 09:58:03.986801 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsncr\" (UniqueName: \"kubernetes.io/projected/2e97f930-5cd6-46d9-a168-f1a3b4f98936-kube-api-access-nsncr\") pod \"2e97f930-5cd6-46d9-a168-f1a3b4f98936\" (UID: \"2e97f930-5cd6-46d9-a168-f1a3b4f98936\") " Mar 11 09:58:03 crc kubenswrapper[4830]: I0311 09:58:03.995867 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e97f930-5cd6-46d9-a168-f1a3b4f98936-kube-api-access-nsncr" (OuterVolumeSpecName: "kube-api-access-nsncr") pod "2e97f930-5cd6-46d9-a168-f1a3b4f98936" (UID: "2e97f930-5cd6-46d9-a168-f1a3b4f98936"). InnerVolumeSpecName "kube-api-access-nsncr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:58:04 crc kubenswrapper[4830]: I0311 09:58:04.088930 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsncr\" (UniqueName: \"kubernetes.io/projected/2e97f930-5cd6-46d9-a168-f1a3b4f98936-kube-api-access-nsncr\") on node \"crc\" DevicePath \"\"" Mar 11 09:58:04 crc kubenswrapper[4830]: I0311 09:58:04.624779 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553718-5wskx" event={"ID":"2e97f930-5cd6-46d9-a168-f1a3b4f98936","Type":"ContainerDied","Data":"173633b7700cab10d0e1e0b2d8928d6d8c17545590895a25d9637bf0fc817e55"} Mar 11 09:58:04 crc kubenswrapper[4830]: I0311 09:58:04.624821 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="173633b7700cab10d0e1e0b2d8928d6d8c17545590895a25d9637bf0fc817e55" Mar 11 09:58:04 crc kubenswrapper[4830]: I0311 09:58:04.624823 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553718-5wskx" Mar 11 09:58:05 crc kubenswrapper[4830]: I0311 09:58:05.038405 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553712-5j9rz"] Mar 11 09:58:05 crc kubenswrapper[4830]: I0311 09:58:05.048833 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553712-5j9rz"] Mar 11 09:58:06 crc kubenswrapper[4830]: I0311 09:58:06.943391 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d876e5-5e9a-49d5-8bf4-306593bcb686" path="/var/lib/kubelet/pods/37d876e5-5e9a-49d5-8bf4-306593bcb686/volumes" Mar 11 09:58:12 crc kubenswrapper[4830]: I0311 09:58:12.711701 4830 scope.go:117] "RemoveContainer" containerID="293c557ec8fcb06fcacad5f57473e6648a62082d73fbb9933a969f1227b135fe" Mar 11 09:58:23 crc kubenswrapper[4830]: I0311 09:58:23.807609 4830 generic.go:334] "Generic (PLEG): container finished" podID="d6daac1f-f36f-42a1-9735-1b182e03052e" containerID="db4abbbc669c51a9b5c4779587f786ca4b8b6711a67608608e22662e6d06f752" exitCode=0 Mar 11 09:58:23 crc kubenswrapper[4830]: I0311 09:58:23.807751 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" event={"ID":"d6daac1f-f36f-42a1-9735-1b182e03052e","Type":"ContainerDied","Data":"db4abbbc669c51a9b5c4779587f786ca4b8b6711a67608608e22662e6d06f752"} Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.249238 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.308199 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-2\") pod \"d6daac1f-f36f-42a1-9735-1b182e03052e\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.308308 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-0\") pod \"d6daac1f-f36f-42a1-9735-1b182e03052e\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.308524 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-1\") pod \"d6daac1f-f36f-42a1-9735-1b182e03052e\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.308665 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-telemetry-combined-ca-bundle\") pod \"d6daac1f-f36f-42a1-9735-1b182e03052e\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.308705 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ssh-key-openstack-edpm-ipam\") pod \"d6daac1f-f36f-42a1-9735-1b182e03052e\" (UID: 
\"d6daac1f-f36f-42a1-9735-1b182e03052e\") " Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.308769 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-inventory\") pod \"d6daac1f-f36f-42a1-9735-1b182e03052e\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.308797 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdhf2\" (UniqueName: \"kubernetes.io/projected/d6daac1f-f36f-42a1-9735-1b182e03052e-kube-api-access-wdhf2\") pod \"d6daac1f-f36f-42a1-9735-1b182e03052e\" (UID: \"d6daac1f-f36f-42a1-9735-1b182e03052e\") " Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.316910 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d6daac1f-f36f-42a1-9735-1b182e03052e" (UID: "d6daac1f-f36f-42a1-9735-1b182e03052e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.323194 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6daac1f-f36f-42a1-9735-1b182e03052e-kube-api-access-wdhf2" (OuterVolumeSpecName: "kube-api-access-wdhf2") pod "d6daac1f-f36f-42a1-9735-1b182e03052e" (UID: "d6daac1f-f36f-42a1-9735-1b182e03052e"). InnerVolumeSpecName "kube-api-access-wdhf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.341077 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "d6daac1f-f36f-42a1-9735-1b182e03052e" (UID: "d6daac1f-f36f-42a1-9735-1b182e03052e"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.341671 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "d6daac1f-f36f-42a1-9735-1b182e03052e" (UID: "d6daac1f-f36f-42a1-9735-1b182e03052e"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.343422 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d6daac1f-f36f-42a1-9735-1b182e03052e" (UID: "d6daac1f-f36f-42a1-9735-1b182e03052e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.346238 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "d6daac1f-f36f-42a1-9735-1b182e03052e" (UID: "d6daac1f-f36f-42a1-9735-1b182e03052e"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.358151 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-inventory" (OuterVolumeSpecName: "inventory") pod "d6daac1f-f36f-42a1-9735-1b182e03052e" (UID: "d6daac1f-f36f-42a1-9735-1b182e03052e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.411705 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.411746 4830 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.411761 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.411776 4830 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-inventory\") on node \"crc\" DevicePath \"\"" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.411787 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdhf2\" (UniqueName: \"kubernetes.io/projected/d6daac1f-f36f-42a1-9735-1b182e03052e-kube-api-access-wdhf2\") on node \"crc\" DevicePath \"\"" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.411798 4830 reconciler_common.go:293] "Volume 
detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.411809 4830 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/d6daac1f-f36f-42a1-9735-1b182e03052e-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.828765 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" event={"ID":"d6daac1f-f36f-42a1-9735-1b182e03052e","Type":"ContainerDied","Data":"5bc2b8b02f355fdf5585b9a10227889f11b9651892c9f1e08dcd5178878f33dc"} Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.828815 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bc2b8b02f355fdf5585b9a10227889f11b9651892c9f1e08dcd5178878f33dc" Mar 11 09:58:25 crc kubenswrapper[4830]: I0311 09:58:25.828853 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4" Mar 11 09:59:13 crc kubenswrapper[4830]: I0311 09:59:13.060709 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:59:13 crc kubenswrapper[4830]: I0311 09:59:13.061223 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.493822 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 11 09:59:16 crc kubenswrapper[4830]: E0311 09:59:16.495158 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e97f930-5cd6-46d9-a168-f1a3b4f98936" containerName="oc" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.495183 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e97f930-5cd6-46d9-a168-f1a3b4f98936" containerName="oc" Mar 11 09:59:16 crc kubenswrapper[4830]: E0311 09:59:16.495208 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6daac1f-f36f-42a1-9735-1b182e03052e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.495222 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6daac1f-f36f-42a1-9735-1b182e03052e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.495470 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e97f930-5cd6-46d9-a168-f1a3b4f98936" 
containerName="oc" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.495507 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6daac1f-f36f-42a1-9735-1b182e03052e" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.496496 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.499056 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.499153 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.499153 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-c5jjz" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.499492 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.504384 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605116 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2vl\" (UniqueName: \"kubernetes.io/projected/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-kube-api-access-mq2vl\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605182 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605214 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605235 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605282 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605303 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605325 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-config-data\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605344 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.605372 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707623 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2vl\" (UniqueName: \"kubernetes.io/projected/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-kube-api-access-mq2vl\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707694 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707725 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707748 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707797 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707816 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707836 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-config-data\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707852 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: 
\"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.707882 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.708321 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.708563 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.709182 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.709907 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " 
pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.710871 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-config-data\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.717533 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.727227 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.727605 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.728406 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2vl\" (UniqueName: \"kubernetes.io/projected/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-kube-api-access-mq2vl\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.744209 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " pod="openstack/tempest-tests-tempest" Mar 11 09:59:16 crc kubenswrapper[4830]: I0311 09:59:16.821427 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 11 09:59:17 crc kubenswrapper[4830]: I0311 09:59:17.271650 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 11 09:59:17 crc kubenswrapper[4830]: I0311 09:59:17.332090 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3","Type":"ContainerStarted","Data":"470b1c13ad588d9c089141c5faf08022da684564d2105ff76b7147bfa5cbe7a7"} Mar 11 09:59:43 crc kubenswrapper[4830]: I0311 09:59:43.060648 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 09:59:43 crc kubenswrapper[4830]: I0311 09:59:43.061155 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 09:59:44 crc kubenswrapper[4830]: E0311 09:59:44.593435 4830 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 11 09:59:44 crc kubenswrapper[4830]: E0311 09:59:44.594702 4830 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mq2vl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 11 09:59:44 crc kubenswrapper[4830]: E0311 09:59:44.596030 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" Mar 11 09:59:45 crc kubenswrapper[4830]: E0311 09:59:45.599118 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" Mar 11 09:59:59 crc 
kubenswrapper[4830]: I0311 09:59:59.757088 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.161056 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw"] Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.162838 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.166231 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.166546 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.176260 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553720-5dnqn"] Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.180427 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.182622 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.182870 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.186279 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.196218 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw"] Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.206488 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553720-5dnqn"] Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.319459 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068707e-8c4e-4602-bb05-fba55fe87d05-config-volume\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.319591 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpqqx\" (UniqueName: \"kubernetes.io/projected/7f1fe71e-7b7f-439e-9d8f-68743095b1a7-kube-api-access-dpqqx\") pod \"auto-csr-approver-29553720-5dnqn\" (UID: \"7f1fe71e-7b7f-439e-9d8f-68743095b1a7\") " pod="openshift-infra/auto-csr-approver-29553720-5dnqn" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.319619 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vjwpq\" (UniqueName: \"kubernetes.io/projected/f068707e-8c4e-4602-bb05-fba55fe87d05-kube-api-access-vjwpq\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.319674 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068707e-8c4e-4602-bb05-fba55fe87d05-secret-volume\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.421914 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpqqx\" (UniqueName: \"kubernetes.io/projected/7f1fe71e-7b7f-439e-9d8f-68743095b1a7-kube-api-access-dpqqx\") pod \"auto-csr-approver-29553720-5dnqn\" (UID: \"7f1fe71e-7b7f-439e-9d8f-68743095b1a7\") " pod="openshift-infra/auto-csr-approver-29553720-5dnqn" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.421983 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjwpq\" (UniqueName: \"kubernetes.io/projected/f068707e-8c4e-4602-bb05-fba55fe87d05-kube-api-access-vjwpq\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.422095 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068707e-8c4e-4602-bb05-fba55fe87d05-secret-volume\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.422217 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068707e-8c4e-4602-bb05-fba55fe87d05-config-volume\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.423282 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068707e-8c4e-4602-bb05-fba55fe87d05-config-volume\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.438235 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068707e-8c4e-4602-bb05-fba55fe87d05-secret-volume\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.441386 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjwpq\" (UniqueName: \"kubernetes.io/projected/f068707e-8c4e-4602-bb05-fba55fe87d05-kube-api-access-vjwpq\") pod \"collect-profiles-29553720-fwgjw\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.466134 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpqqx\" (UniqueName: 
\"kubernetes.io/projected/7f1fe71e-7b7f-439e-9d8f-68743095b1a7-kube-api-access-dpqqx\") pod \"auto-csr-approver-29553720-5dnqn\" (UID: \"7f1fe71e-7b7f-439e-9d8f-68743095b1a7\") " pod="openshift-infra/auto-csr-approver-29553720-5dnqn" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.492635 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.509804 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" Mar 11 10:00:00 crc kubenswrapper[4830]: W0311 10:00:00.966873 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf068707e_8c4e_4602_bb05_fba55fe87d05.slice/crio-129ed52d01e9b4cbe09ddff732b1f711c87396eb161ab3c5d04477458f4c2fc0 WatchSource:0}: Error finding container 129ed52d01e9b4cbe09ddff732b1f711c87396eb161ab3c5d04477458f4c2fc0: Status 404 returned error can't find the container with id 129ed52d01e9b4cbe09ddff732b1f711c87396eb161ab3c5d04477458f4c2fc0 Mar 11 10:00:00 crc kubenswrapper[4830]: I0311 10:00:00.967040 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw"] Mar 11 10:00:01 crc kubenswrapper[4830]: I0311 10:00:01.033267 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553720-5dnqn"] Mar 11 10:00:01 crc kubenswrapper[4830]: W0311 10:00:01.038930 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f1fe71e_7b7f_439e_9d8f_68743095b1a7.slice/crio-2ad9045ca402779aad377fe9696997fedfdda1f34feb435c3d90067897193077 WatchSource:0}: Error finding container 2ad9045ca402779aad377fe9696997fedfdda1f34feb435c3d90067897193077: Status 404 returned error can't 
find the container with id 2ad9045ca402779aad377fe9696997fedfdda1f34feb435c3d90067897193077 Mar 11 10:00:01 crc kubenswrapper[4830]: I0311 10:00:01.735841 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" event={"ID":"7f1fe71e-7b7f-439e-9d8f-68743095b1a7","Type":"ContainerStarted","Data":"2ad9045ca402779aad377fe9696997fedfdda1f34feb435c3d90067897193077"} Mar 11 10:00:01 crc kubenswrapper[4830]: I0311 10:00:01.737689 4830 generic.go:334] "Generic (PLEG): container finished" podID="f068707e-8c4e-4602-bb05-fba55fe87d05" containerID="e9659efc5912d439267033347a83d547ae5fa3876a5c80a484031d63416b0f0c" exitCode=0 Mar 11 10:00:01 crc kubenswrapper[4830]: I0311 10:00:01.737736 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" event={"ID":"f068707e-8c4e-4602-bb05-fba55fe87d05","Type":"ContainerDied","Data":"e9659efc5912d439267033347a83d547ae5fa3876a5c80a484031d63416b0f0c"} Mar 11 10:00:01 crc kubenswrapper[4830]: I0311 10:00:01.737771 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" event={"ID":"f068707e-8c4e-4602-bb05-fba55fe87d05","Type":"ContainerStarted","Data":"129ed52d01e9b4cbe09ddff732b1f711c87396eb161ab3c5d04477458f4c2fc0"} Mar 11 10:00:01 crc kubenswrapper[4830]: I0311 10:00:01.739572 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3","Type":"ContainerStarted","Data":"89e020fe3c13a5fb6b5d0340eb81add04cdc9a53e94a5dc09f419fd5141d94e7"} Mar 11 10:00:01 crc kubenswrapper[4830]: I0311 10:00:01.780582 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.307707482 podStartE2EDuration="46.780564554s" podCreationTimestamp="2026-03-11 09:59:15 +0000 UTC" firstStartedPulling="2026-03-11 
09:59:17.281564796 +0000 UTC m=+2725.062715485" lastFinishedPulling="2026-03-11 09:59:59.754421868 +0000 UTC m=+2767.535572557" observedRunningTime="2026-03-11 10:00:01.771655691 +0000 UTC m=+2769.552806400" watchObservedRunningTime="2026-03-11 10:00:01.780564554 +0000 UTC m=+2769.561715243" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.102455 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.178742 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068707e-8c4e-4602-bb05-fba55fe87d05-config-volume\") pod \"f068707e-8c4e-4602-bb05-fba55fe87d05\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.178896 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjwpq\" (UniqueName: \"kubernetes.io/projected/f068707e-8c4e-4602-bb05-fba55fe87d05-kube-api-access-vjwpq\") pod \"f068707e-8c4e-4602-bb05-fba55fe87d05\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.179149 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068707e-8c4e-4602-bb05-fba55fe87d05-secret-volume\") pod \"f068707e-8c4e-4602-bb05-fba55fe87d05\" (UID: \"f068707e-8c4e-4602-bb05-fba55fe87d05\") " Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.179690 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f068707e-8c4e-4602-bb05-fba55fe87d05-config-volume" (OuterVolumeSpecName: "config-volume") pod "f068707e-8c4e-4602-bb05-fba55fe87d05" (UID: "f068707e-8c4e-4602-bb05-fba55fe87d05"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.185539 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f068707e-8c4e-4602-bb05-fba55fe87d05-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f068707e-8c4e-4602-bb05-fba55fe87d05" (UID: "f068707e-8c4e-4602-bb05-fba55fe87d05"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.186096 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f068707e-8c4e-4602-bb05-fba55fe87d05-kube-api-access-vjwpq" (OuterVolumeSpecName: "kube-api-access-vjwpq") pod "f068707e-8c4e-4602-bb05-fba55fe87d05" (UID: "f068707e-8c4e-4602-bb05-fba55fe87d05"). InnerVolumeSpecName "kube-api-access-vjwpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.280981 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjwpq\" (UniqueName: \"kubernetes.io/projected/f068707e-8c4e-4602-bb05-fba55fe87d05-kube-api-access-vjwpq\") on node \"crc\" DevicePath \"\"" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.281025 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f068707e-8c4e-4602-bb05-fba55fe87d05-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.281035 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f068707e-8c4e-4602-bb05-fba55fe87d05-config-volume\") on node \"crc\" DevicePath \"\"" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.763079 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" 
event={"ID":"f068707e-8c4e-4602-bb05-fba55fe87d05","Type":"ContainerDied","Data":"129ed52d01e9b4cbe09ddff732b1f711c87396eb161ab3c5d04477458f4c2fc0"} Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.763407 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="129ed52d01e9b4cbe09ddff732b1f711c87396eb161ab3c5d04477458f4c2fc0" Mar 11 10:00:03 crc kubenswrapper[4830]: I0311 10:00:03.763545 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553720-fwgjw" Mar 11 10:00:04 crc kubenswrapper[4830]: I0311 10:00:04.188802 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx"] Mar 11 10:00:04 crc kubenswrapper[4830]: I0311 10:00:04.199131 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553675-2mnnx"] Mar 11 10:00:04 crc kubenswrapper[4830]: I0311 10:00:04.774001 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" event={"ID":"7f1fe71e-7b7f-439e-9d8f-68743095b1a7","Type":"ContainerStarted","Data":"b99227ff704a4d01d8d2bac7f81921f727f758b3aa979d157354dff1cc0c27ac"} Mar 11 10:00:04 crc kubenswrapper[4830]: I0311 10:00:04.787321 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" podStartSLOduration=1.336949106 podStartE2EDuration="4.78729871s" podCreationTimestamp="2026-03-11 10:00:00 +0000 UTC" firstStartedPulling="2026-03-11 10:00:01.042524867 +0000 UTC m=+2768.823675556" lastFinishedPulling="2026-03-11 10:00:04.492874481 +0000 UTC m=+2772.274025160" observedRunningTime="2026-03-11 10:00:04.786113539 +0000 UTC m=+2772.567264248" watchObservedRunningTime="2026-03-11 10:00:04.78729871 +0000 UTC m=+2772.568449399" Mar 11 10:00:04 crc kubenswrapper[4830]: I0311 10:00:04.945406 
4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd0238e-2294-4b23-ab03-88e149c4a0c9" path="/var/lib/kubelet/pods/efd0238e-2294-4b23-ab03-88e149c4a0c9/volumes" Mar 11 10:00:05 crc kubenswrapper[4830]: I0311 10:00:05.783897 4830 generic.go:334] "Generic (PLEG): container finished" podID="7f1fe71e-7b7f-439e-9d8f-68743095b1a7" containerID="b99227ff704a4d01d8d2bac7f81921f727f758b3aa979d157354dff1cc0c27ac" exitCode=0 Mar 11 10:00:05 crc kubenswrapper[4830]: I0311 10:00:05.783942 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" event={"ID":"7f1fe71e-7b7f-439e-9d8f-68743095b1a7","Type":"ContainerDied","Data":"b99227ff704a4d01d8d2bac7f81921f727f758b3aa979d157354dff1cc0c27ac"} Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.113420 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.257272 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpqqx\" (UniqueName: \"kubernetes.io/projected/7f1fe71e-7b7f-439e-9d8f-68743095b1a7-kube-api-access-dpqqx\") pod \"7f1fe71e-7b7f-439e-9d8f-68743095b1a7\" (UID: \"7f1fe71e-7b7f-439e-9d8f-68743095b1a7\") " Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.263271 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1fe71e-7b7f-439e-9d8f-68743095b1a7-kube-api-access-dpqqx" (OuterVolumeSpecName: "kube-api-access-dpqqx") pod "7f1fe71e-7b7f-439e-9d8f-68743095b1a7" (UID: "7f1fe71e-7b7f-439e-9d8f-68743095b1a7"). InnerVolumeSpecName "kube-api-access-dpqqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.359569 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpqqx\" (UniqueName: \"kubernetes.io/projected/7f1fe71e-7b7f-439e-9d8f-68743095b1a7-kube-api-access-dpqqx\") on node \"crc\" DevicePath \"\"" Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.806011 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" event={"ID":"7f1fe71e-7b7f-439e-9d8f-68743095b1a7","Type":"ContainerDied","Data":"2ad9045ca402779aad377fe9696997fedfdda1f34feb435c3d90067897193077"} Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.806067 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ad9045ca402779aad377fe9696997fedfdda1f34feb435c3d90067897193077" Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.806443 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553720-5dnqn" Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.851604 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553714-zbghn"] Mar 11 10:00:07 crc kubenswrapper[4830]: I0311 10:00:07.861738 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553714-zbghn"] Mar 11 10:00:08 crc kubenswrapper[4830]: I0311 10:00:08.944358 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="338ffab6-8c72-40aa-b2b6-582c35164f2a" path="/var/lib/kubelet/pods/338ffab6-8c72-40aa-b2b6-582c35164f2a/volumes" Mar 11 10:00:12 crc kubenswrapper[4830]: I0311 10:00:12.802896 4830 scope.go:117] "RemoveContainer" containerID="92720a5af4102d3f119c4a6e93e10dfc968c09a8724c3beba6d5ddd3c34046d5" Mar 11 10:00:12 crc kubenswrapper[4830]: I0311 10:00:12.846983 4830 scope.go:117] "RemoveContainer" 
containerID="ef91b38f05c9233b18dc904599d163b9c05cf7a5f21254180b50c4a2ffe400b5" Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.060532 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.060960 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.061161 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.062123 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65b35d2c5b201abecd063ae3cba9aa95e395340f3759873b180ee4836ac11e36"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.062314 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://65b35d2c5b201abecd063ae3cba9aa95e395340f3759873b180ee4836ac11e36" gracePeriod=600 Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.865052 4830 generic.go:334] "Generic (PLEG): container finished" 
podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="65b35d2c5b201abecd063ae3cba9aa95e395340f3759873b180ee4836ac11e36" exitCode=0 Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.865142 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"65b35d2c5b201abecd063ae3cba9aa95e395340f3759873b180ee4836ac11e36"} Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.865689 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09"} Mar 11 10:00:13 crc kubenswrapper[4830]: I0311 10:00:13.865720 4830 scope.go:117] "RemoveContainer" containerID="5e4a762821fec3abf61dc22badb7a7692c4c1d38d46823156a5b6836981b3de8" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.688082 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppl6c"] Mar 11 10:00:45 crc kubenswrapper[4830]: E0311 10:00:45.689043 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1fe71e-7b7f-439e-9d8f-68743095b1a7" containerName="oc" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.689062 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1fe71e-7b7f-439e-9d8f-68743095b1a7" containerName="oc" Mar 11 10:00:45 crc kubenswrapper[4830]: E0311 10:00:45.689112 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f068707e-8c4e-4602-bb05-fba55fe87d05" containerName="collect-profiles" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.689142 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="f068707e-8c4e-4602-bb05-fba55fe87d05" containerName="collect-profiles" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.689324 4830 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f068707e-8c4e-4602-bb05-fba55fe87d05" containerName="collect-profiles" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.689340 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1fe71e-7b7f-439e-9d8f-68743095b1a7" containerName="oc" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.690686 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.710665 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppl6c"] Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.772939 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7bl7\" (UniqueName: \"kubernetes.io/projected/a3cc1d21-68c5-4820-a577-0e773ffc084b-kube-api-access-c7bl7\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.773105 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-utilities\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.773138 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-catalog-content\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.874591 4830 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7bl7\" (UniqueName: \"kubernetes.io/projected/a3cc1d21-68c5-4820-a577-0e773ffc084b-kube-api-access-c7bl7\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.874717 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-utilities\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.874760 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-catalog-content\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.875328 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-catalog-content\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.875508 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-utilities\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:45 crc kubenswrapper[4830]: I0311 10:00:45.900866 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c7bl7\" (UniqueName: \"kubernetes.io/projected/a3cc1d21-68c5-4820-a577-0e773ffc084b-kube-api-access-c7bl7\") pod \"redhat-operators-ppl6c\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:46 crc kubenswrapper[4830]: I0311 10:00:46.015048 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:00:46 crc kubenswrapper[4830]: I0311 10:00:46.529578 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppl6c"] Mar 11 10:00:47 crc kubenswrapper[4830]: I0311 10:00:47.189225 4830 generic.go:334] "Generic (PLEG): container finished" podID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerID="46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af" exitCode=0 Mar 11 10:00:47 crc kubenswrapper[4830]: I0311 10:00:47.189286 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppl6c" event={"ID":"a3cc1d21-68c5-4820-a577-0e773ffc084b","Type":"ContainerDied","Data":"46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af"} Mar 11 10:00:47 crc kubenswrapper[4830]: I0311 10:00:47.189513 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppl6c" event={"ID":"a3cc1d21-68c5-4820-a577-0e773ffc084b","Type":"ContainerStarted","Data":"9eb115f6f417b110042ba5bcbebb6fa2f3d61329d615c5f307020de7edde6017"} Mar 11 10:00:49 crc kubenswrapper[4830]: I0311 10:00:49.208201 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppl6c" event={"ID":"a3cc1d21-68c5-4820-a577-0e773ffc084b","Type":"ContainerStarted","Data":"c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066"} Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.741688 4830 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cldf9"] Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.744227 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.761744 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwhw\" (UniqueName: \"kubernetes.io/projected/9e65ad73-b85e-46fd-bf92-533d92bb2c25-kube-api-access-sgwhw\") pod \"redhat-marketplace-cldf9\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.761850 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-utilities\") pod \"redhat-marketplace-cldf9\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.761975 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-catalog-content\") pod \"redhat-marketplace-cldf9\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.769584 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cldf9"] Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.864012 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-catalog-content\") pod \"redhat-marketplace-cldf9\" (UID: 
\"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.864183 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwhw\" (UniqueName: \"kubernetes.io/projected/9e65ad73-b85e-46fd-bf92-533d92bb2c25-kube-api-access-sgwhw\") pod \"redhat-marketplace-cldf9\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.864270 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-utilities\") pod \"redhat-marketplace-cldf9\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.864529 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-catalog-content\") pod \"redhat-marketplace-cldf9\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.864742 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-utilities\") pod \"redhat-marketplace-cldf9\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:54 crc kubenswrapper[4830]: I0311 10:00:54.885084 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwhw\" (UniqueName: \"kubernetes.io/projected/9e65ad73-b85e-46fd-bf92-533d92bb2c25-kube-api-access-sgwhw\") pod \"redhat-marketplace-cldf9\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " 
pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:55 crc kubenswrapper[4830]: I0311 10:00:55.065762 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:00:55 crc kubenswrapper[4830]: I0311 10:00:55.544463 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cldf9"] Mar 11 10:00:56 crc kubenswrapper[4830]: I0311 10:00:56.282556 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cldf9" event={"ID":"9e65ad73-b85e-46fd-bf92-533d92bb2c25","Type":"ContainerStarted","Data":"869a3d69064b620ee5aba69246f0e95b22b450485ad2135a4629cab87f4fcaab"} Mar 11 10:00:57 crc kubenswrapper[4830]: I0311 10:00:57.294010 4830 generic.go:334] "Generic (PLEG): container finished" podID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerID="26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658" exitCode=0 Mar 11 10:00:57 crc kubenswrapper[4830]: I0311 10:00:57.294088 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cldf9" event={"ID":"9e65ad73-b85e-46fd-bf92-533d92bb2c25","Type":"ContainerDied","Data":"26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658"} Mar 11 10:00:57 crc kubenswrapper[4830]: I0311 10:00:57.299008 4830 generic.go:334] "Generic (PLEG): container finished" podID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerID="c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066" exitCode=0 Mar 11 10:00:57 crc kubenswrapper[4830]: I0311 10:00:57.299079 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppl6c" event={"ID":"a3cc1d21-68c5-4820-a577-0e773ffc084b","Type":"ContainerDied","Data":"c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066"} Mar 11 10:00:58 crc kubenswrapper[4830]: I0311 10:00:58.310679 4830 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-ppl6c" event={"ID":"a3cc1d21-68c5-4820-a577-0e773ffc084b","Type":"ContainerStarted","Data":"8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d"} Mar 11 10:00:58 crc kubenswrapper[4830]: I0311 10:00:58.312735 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cldf9" event={"ID":"9e65ad73-b85e-46fd-bf92-533d92bb2c25","Type":"ContainerStarted","Data":"e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b"} Mar 11 10:00:58 crc kubenswrapper[4830]: I0311 10:00:58.377767 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppl6c" podStartSLOduration=2.840674051 podStartE2EDuration="13.377736851s" podCreationTimestamp="2026-03-11 10:00:45 +0000 UTC" firstStartedPulling="2026-03-11 10:00:47.191106737 +0000 UTC m=+2814.972257426" lastFinishedPulling="2026-03-11 10:00:57.728169537 +0000 UTC m=+2825.509320226" observedRunningTime="2026-03-11 10:00:58.343616441 +0000 UTC m=+2826.124767150" watchObservedRunningTime="2026-03-11 10:00:58.377736851 +0000 UTC m=+2826.158887540" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.146970 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29553721-rtbwn"] Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.149489 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.165533 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29553721-rtbwn"] Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.274404 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-combined-ca-bundle\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.274544 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-config-data\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.274616 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcmqt\" (UniqueName: \"kubernetes.io/projected/04ab5666-8c5d-4e96-9c47-502bdc63bafb-kube-api-access-tcmqt\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.274652 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-fernet-keys\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.376553 4830 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-combined-ca-bundle\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.376660 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-config-data\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.376723 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcmqt\" (UniqueName: \"kubernetes.io/projected/04ab5666-8c5d-4e96-9c47-502bdc63bafb-kube-api-access-tcmqt\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.376769 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-fernet-keys\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.385137 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-combined-ca-bundle\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.385146 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-config-data\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.385243 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-fernet-keys\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.396363 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcmqt\" (UniqueName: \"kubernetes.io/projected/04ab5666-8c5d-4e96-9c47-502bdc63bafb-kube-api-access-tcmqt\") pod \"keystone-cron-29553721-rtbwn\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.481078 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:00 crc kubenswrapper[4830]: I0311 10:01:00.971699 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29553721-rtbwn"] Mar 11 10:01:00 crc kubenswrapper[4830]: W0311 10:01:00.997039 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ab5666_8c5d_4e96_9c47_502bdc63bafb.slice/crio-5d1cf44bbbb23b565eab0f7d01a6eed45d912d897015008afefe60b9cf0ec46e WatchSource:0}: Error finding container 5d1cf44bbbb23b565eab0f7d01a6eed45d912d897015008afefe60b9cf0ec46e: Status 404 returned error can't find the container with id 5d1cf44bbbb23b565eab0f7d01a6eed45d912d897015008afefe60b9cf0ec46e Mar 11 10:01:01 crc kubenswrapper[4830]: I0311 10:01:01.346465 4830 generic.go:334] "Generic (PLEG): container finished" podID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerID="e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b" exitCode=0 Mar 11 10:01:01 crc kubenswrapper[4830]: I0311 10:01:01.346562 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cldf9" event={"ID":"9e65ad73-b85e-46fd-bf92-533d92bb2c25","Type":"ContainerDied","Data":"e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b"} Mar 11 10:01:01 crc kubenswrapper[4830]: I0311 10:01:01.349261 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 10:01:01 crc kubenswrapper[4830]: I0311 10:01:01.350870 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29553721-rtbwn" event={"ID":"04ab5666-8c5d-4e96-9c47-502bdc63bafb","Type":"ContainerStarted","Data":"2941288d66b061a3b5dc00b2d2541d142ca9c415c70b739b61e80de366f19137"} Mar 11 10:01:01 crc kubenswrapper[4830]: I0311 10:01:01.351025 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29553721-rtbwn" 
event={"ID":"04ab5666-8c5d-4e96-9c47-502bdc63bafb","Type":"ContainerStarted","Data":"5d1cf44bbbb23b565eab0f7d01a6eed45d912d897015008afefe60b9cf0ec46e"} Mar 11 10:01:01 crc kubenswrapper[4830]: I0311 10:01:01.394104 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29553721-rtbwn" podStartSLOduration=1.394077662 podStartE2EDuration="1.394077662s" podCreationTimestamp="2026-03-11 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 10:01:01.381621902 +0000 UTC m=+2829.162772611" watchObservedRunningTime="2026-03-11 10:01:01.394077662 +0000 UTC m=+2829.175228361" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.364008 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cldf9" event={"ID":"9e65ad73-b85e-46fd-bf92-533d92bb2c25","Type":"ContainerStarted","Data":"4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148"} Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.401084 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cldf9" podStartSLOduration=3.929311772 podStartE2EDuration="8.401062943s" podCreationTimestamp="2026-03-11 10:00:54 +0000 UTC" firstStartedPulling="2026-03-11 10:00:57.296105534 +0000 UTC m=+2825.077256223" lastFinishedPulling="2026-03-11 10:01:01.767856705 +0000 UTC m=+2829.549007394" observedRunningTime="2026-03-11 10:01:02.386376092 +0000 UTC m=+2830.167526791" watchObservedRunningTime="2026-03-11 10:01:02.401062943 +0000 UTC m=+2830.182213632" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.555141 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tk4h5"] Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.557679 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.572775 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tk4h5"] Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.649671 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csnhx\" (UniqueName: \"kubernetes.io/projected/aaca749f-8f01-4e4b-8995-aa703a68324f-kube-api-access-csnhx\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.650057 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-catalog-content\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.650299 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-utilities\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.752458 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-catalog-content\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.752567 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-utilities\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.752679 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csnhx\" (UniqueName: \"kubernetes.io/projected/aaca749f-8f01-4e4b-8995-aa703a68324f-kube-api-access-csnhx\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.753068 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-catalog-content\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.753254 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-utilities\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.771672 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csnhx\" (UniqueName: \"kubernetes.io/projected/aaca749f-8f01-4e4b-8995-aa703a68324f-kube-api-access-csnhx\") pod \"certified-operators-tk4h5\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:02 crc kubenswrapper[4830]: I0311 10:01:02.901916 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:03 crc kubenswrapper[4830]: I0311 10:01:03.449761 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tk4h5"] Mar 11 10:01:04 crc kubenswrapper[4830]: I0311 10:01:04.382847 4830 generic.go:334] "Generic (PLEG): container finished" podID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerID="221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4" exitCode=0 Mar 11 10:01:04 crc kubenswrapper[4830]: I0311 10:01:04.382947 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk4h5" event={"ID":"aaca749f-8f01-4e4b-8995-aa703a68324f","Type":"ContainerDied","Data":"221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4"} Mar 11 10:01:04 crc kubenswrapper[4830]: I0311 10:01:04.383162 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk4h5" event={"ID":"aaca749f-8f01-4e4b-8995-aa703a68324f","Type":"ContainerStarted","Data":"15e801d29000eb371dd9a1256d6c72ab01bcb8c6af6beab809c14305a0a6dd2c"} Mar 11 10:01:04 crc kubenswrapper[4830]: E0311 10:01:04.874875 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ab5666_8c5d_4e96_9c47_502bdc63bafb.slice/crio-conmon-2941288d66b061a3b5dc00b2d2541d142ca9c415c70b739b61e80de366f19137.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ab5666_8c5d_4e96_9c47_502bdc63bafb.slice/crio-2941288d66b061a3b5dc00b2d2541d142ca9c415c70b739b61e80de366f19137.scope\": RecentStats: unable to find data in memory cache]" Mar 11 10:01:05 crc kubenswrapper[4830]: I0311 10:01:05.067395 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:01:05 crc kubenswrapper[4830]: I0311 10:01:05.067447 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:01:05 crc kubenswrapper[4830]: I0311 10:01:05.393554 4830 generic.go:334] "Generic (PLEG): container finished" podID="04ab5666-8c5d-4e96-9c47-502bdc63bafb" containerID="2941288d66b061a3b5dc00b2d2541d142ca9c415c70b739b61e80de366f19137" exitCode=0 Mar 11 10:01:05 crc kubenswrapper[4830]: I0311 10:01:05.393601 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29553721-rtbwn" event={"ID":"04ab5666-8c5d-4e96-9c47-502bdc63bafb","Type":"ContainerDied","Data":"2941288d66b061a3b5dc00b2d2541d142ca9c415c70b739b61e80de366f19137"} Mar 11 10:01:05 crc kubenswrapper[4830]: I0311 10:01:05.395955 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk4h5" event={"ID":"aaca749f-8f01-4e4b-8995-aa703a68324f","Type":"ContainerStarted","Data":"bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df"} Mar 11 10:01:06 crc kubenswrapper[4830]: I0311 10:01:06.016095 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:01:06 crc kubenswrapper[4830]: I0311 10:01:06.016156 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:01:06 crc kubenswrapper[4830]: I0311 10:01:06.123291 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-cldf9" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="registry-server" probeResult="failure" output=< Mar 11 10:01:06 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 10:01:06 crc kubenswrapper[4830]: > Mar 11 10:01:06 crc kubenswrapper[4830]: I0311 10:01:06.406667 4830 
generic.go:334] "Generic (PLEG): container finished" podID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerID="bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df" exitCode=0 Mar 11 10:01:06 crc kubenswrapper[4830]: I0311 10:01:06.406714 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk4h5" event={"ID":"aaca749f-8f01-4e4b-8995-aa703a68324f","Type":"ContainerDied","Data":"bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df"} Mar 11 10:01:06 crc kubenswrapper[4830]: I0311 10:01:06.860570 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.038740 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-combined-ca-bundle\") pod \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.039099 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-fernet-keys\") pod \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.039162 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcmqt\" (UniqueName: \"kubernetes.io/projected/04ab5666-8c5d-4e96-9c47-502bdc63bafb-kube-api-access-tcmqt\") pod \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.039186 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-config-data\") pod \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\" (UID: \"04ab5666-8c5d-4e96-9c47-502bdc63bafb\") " Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.047203 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "04ab5666-8c5d-4e96-9c47-502bdc63bafb" (UID: "04ab5666-8c5d-4e96-9c47-502bdc63bafb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.048440 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ab5666-8c5d-4e96-9c47-502bdc63bafb-kube-api-access-tcmqt" (OuterVolumeSpecName: "kube-api-access-tcmqt") pod "04ab5666-8c5d-4e96-9c47-502bdc63bafb" (UID: "04ab5666-8c5d-4e96-9c47-502bdc63bafb"). InnerVolumeSpecName "kube-api-access-tcmqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.064471 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ppl6c" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="registry-server" probeResult="failure" output=< Mar 11 10:01:07 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 10:01:07 crc kubenswrapper[4830]: > Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.081159 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04ab5666-8c5d-4e96-9c47-502bdc63bafb" (UID: "04ab5666-8c5d-4e96-9c47-502bdc63bafb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.111826 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-config-data" (OuterVolumeSpecName: "config-data") pod "04ab5666-8c5d-4e96-9c47-502bdc63bafb" (UID: "04ab5666-8c5d-4e96-9c47-502bdc63bafb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.149347 4830 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.149385 4830 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.149398 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcmqt\" (UniqueName: \"kubernetes.io/projected/04ab5666-8c5d-4e96-9c47-502bdc63bafb-kube-api-access-tcmqt\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.149412 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ab5666-8c5d-4e96-9c47-502bdc63bafb-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.416901 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29553721-rtbwn" event={"ID":"04ab5666-8c5d-4e96-9c47-502bdc63bafb","Type":"ContainerDied","Data":"5d1cf44bbbb23b565eab0f7d01a6eed45d912d897015008afefe60b9cf0ec46e"} Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.417286 4830 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="5d1cf44bbbb23b565eab0f7d01a6eed45d912d897015008afefe60b9cf0ec46e" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.416928 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29553721-rtbwn" Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.419524 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk4h5" event={"ID":"aaca749f-8f01-4e4b-8995-aa703a68324f","Type":"ContainerStarted","Data":"b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed"} Mar 11 10:01:07 crc kubenswrapper[4830]: I0311 10:01:07.447225 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tk4h5" podStartSLOduration=2.704718818 podStartE2EDuration="5.447202858s" podCreationTimestamp="2026-03-11 10:01:02 +0000 UTC" firstStartedPulling="2026-03-11 10:01:04.384760172 +0000 UTC m=+2832.165910861" lastFinishedPulling="2026-03-11 10:01:07.127244212 +0000 UTC m=+2834.908394901" observedRunningTime="2026-03-11 10:01:07.444091184 +0000 UTC m=+2835.225241923" watchObservedRunningTime="2026-03-11 10:01:07.447202858 +0000 UTC m=+2835.228353557" Mar 11 10:01:12 crc kubenswrapper[4830]: I0311 10:01:12.902447 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:12 crc kubenswrapper[4830]: I0311 10:01:12.903003 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:13 crc kubenswrapper[4830]: I0311 10:01:13.952967 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-tk4h5" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="registry-server" probeResult="failure" output=< Mar 11 10:01:13 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 
11 10:01:13 crc kubenswrapper[4830]: > Mar 11 10:01:15 crc kubenswrapper[4830]: I0311 10:01:15.111607 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:01:15 crc kubenswrapper[4830]: I0311 10:01:15.167842 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:01:15 crc kubenswrapper[4830]: I0311 10:01:15.352940 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cldf9"] Mar 11 10:01:16 crc kubenswrapper[4830]: I0311 10:01:16.489118 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cldf9" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="registry-server" containerID="cri-o://4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148" gracePeriod=2 Mar 11 10:01:16 crc kubenswrapper[4830]: I0311 10:01:16.982888 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.059802 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ppl6c" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="registry-server" probeResult="failure" output=< Mar 11 10:01:17 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 10:01:17 crc kubenswrapper[4830]: > Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.141506 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-utilities\") pod \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.141635 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgwhw\" (UniqueName: \"kubernetes.io/projected/9e65ad73-b85e-46fd-bf92-533d92bb2c25-kube-api-access-sgwhw\") pod \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.141738 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-catalog-content\") pod \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\" (UID: \"9e65ad73-b85e-46fd-bf92-533d92bb2c25\") " Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.142816 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-utilities" (OuterVolumeSpecName: "utilities") pod "9e65ad73-b85e-46fd-bf92-533d92bb2c25" (UID: "9e65ad73-b85e-46fd-bf92-533d92bb2c25"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.151217 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e65ad73-b85e-46fd-bf92-533d92bb2c25-kube-api-access-sgwhw" (OuterVolumeSpecName: "kube-api-access-sgwhw") pod "9e65ad73-b85e-46fd-bf92-533d92bb2c25" (UID: "9e65ad73-b85e-46fd-bf92-533d92bb2c25"). InnerVolumeSpecName "kube-api-access-sgwhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.172609 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e65ad73-b85e-46fd-bf92-533d92bb2c25" (UID: "9e65ad73-b85e-46fd-bf92-533d92bb2c25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.244092 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgwhw\" (UniqueName: \"kubernetes.io/projected/9e65ad73-b85e-46fd-bf92-533d92bb2c25-kube-api-access-sgwhw\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.244127 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.244137 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e65ad73-b85e-46fd-bf92-533d92bb2c25-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.499513 4830 generic.go:334] "Generic (PLEG): container finished" podID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" 
containerID="4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148" exitCode=0 Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.499549 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cldf9" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.499567 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cldf9" event={"ID":"9e65ad73-b85e-46fd-bf92-533d92bb2c25","Type":"ContainerDied","Data":"4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148"} Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.499961 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cldf9" event={"ID":"9e65ad73-b85e-46fd-bf92-533d92bb2c25","Type":"ContainerDied","Data":"869a3d69064b620ee5aba69246f0e95b22b450485ad2135a4629cab87f4fcaab"} Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.499984 4830 scope.go:117] "RemoveContainer" containerID="4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.551556 4830 scope.go:117] "RemoveContainer" containerID="e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.611465 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cldf9"] Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.619464 4830 scope.go:117] "RemoveContainer" containerID="26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.685437 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cldf9"] Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.714862 4830 scope.go:117] "RemoveContainer" containerID="4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148" Mar 11 
10:01:17 crc kubenswrapper[4830]: E0311 10:01:17.715324 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148\": container with ID starting with 4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148 not found: ID does not exist" containerID="4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.715354 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148"} err="failed to get container status \"4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148\": rpc error: code = NotFound desc = could not find container \"4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148\": container with ID starting with 4cecb1d2481bc4c096e6779a0905aeb5a21d76837785ae45126b10e37e2fe148 not found: ID does not exist" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.715374 4830 scope.go:117] "RemoveContainer" containerID="e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b" Mar 11 10:01:17 crc kubenswrapper[4830]: E0311 10:01:17.715615 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b\": container with ID starting with e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b not found: ID does not exist" containerID="e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.715638 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b"} err="failed to get container status 
\"e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b\": rpc error: code = NotFound desc = could not find container \"e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b\": container with ID starting with e6820ce32bbb46acef13e4af404f4734a040c1a27dd9e110e5b4db34ec19104b not found: ID does not exist" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.715653 4830 scope.go:117] "RemoveContainer" containerID="26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658" Mar 11 10:01:17 crc kubenswrapper[4830]: E0311 10:01:17.715936 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658\": container with ID starting with 26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658 not found: ID does not exist" containerID="26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658" Mar 11 10:01:17 crc kubenswrapper[4830]: I0311 10:01:17.715956 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658"} err="failed to get container status \"26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658\": rpc error: code = NotFound desc = could not find container \"26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658\": container with ID starting with 26bd9bc1fb48403ab241bbc32c1be856f6dae678828430a6d40c6bfb32596658 not found: ID does not exist" Mar 11 10:01:18 crc kubenswrapper[4830]: I0311 10:01:18.943302 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" path="/var/lib/kubelet/pods/9e65ad73-b85e-46fd-bf92-533d92bb2c25/volumes" Mar 11 10:01:22 crc kubenswrapper[4830]: I0311 10:01:22.963189 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:23 crc kubenswrapper[4830]: I0311 10:01:23.014729 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:23 crc kubenswrapper[4830]: I0311 10:01:23.202493 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tk4h5"] Mar 11 10:01:24 crc kubenswrapper[4830]: I0311 10:01:24.565436 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tk4h5" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="registry-server" containerID="cri-o://b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed" gracePeriod=2 Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.075543 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.131437 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-utilities\") pod \"aaca749f-8f01-4e4b-8995-aa703a68324f\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.132195 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csnhx\" (UniqueName: \"kubernetes.io/projected/aaca749f-8f01-4e4b-8995-aa703a68324f-kube-api-access-csnhx\") pod \"aaca749f-8f01-4e4b-8995-aa703a68324f\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.132339 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-catalog-content\") pod 
\"aaca749f-8f01-4e4b-8995-aa703a68324f\" (UID: \"aaca749f-8f01-4e4b-8995-aa703a68324f\") " Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.132406 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-utilities" (OuterVolumeSpecName: "utilities") pod "aaca749f-8f01-4e4b-8995-aa703a68324f" (UID: "aaca749f-8f01-4e4b-8995-aa703a68324f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.133304 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.142431 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaca749f-8f01-4e4b-8995-aa703a68324f-kube-api-access-csnhx" (OuterVolumeSpecName: "kube-api-access-csnhx") pod "aaca749f-8f01-4e4b-8995-aa703a68324f" (UID: "aaca749f-8f01-4e4b-8995-aa703a68324f"). InnerVolumeSpecName "kube-api-access-csnhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.195021 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aaca749f-8f01-4e4b-8995-aa703a68324f" (UID: "aaca749f-8f01-4e4b-8995-aa703a68324f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.234653 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csnhx\" (UniqueName: \"kubernetes.io/projected/aaca749f-8f01-4e4b-8995-aa703a68324f-kube-api-access-csnhx\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.234694 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaca749f-8f01-4e4b-8995-aa703a68324f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.576849 4830 generic.go:334] "Generic (PLEG): container finished" podID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerID="b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed" exitCode=0 Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.576918 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk4h5" event={"ID":"aaca749f-8f01-4e4b-8995-aa703a68324f","Type":"ContainerDied","Data":"b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed"} Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.577285 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk4h5" event={"ID":"aaca749f-8f01-4e4b-8995-aa703a68324f","Type":"ContainerDied","Data":"15e801d29000eb371dd9a1256d6c72ab01bcb8c6af6beab809c14305a0a6dd2c"} Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.577308 4830 scope.go:117] "RemoveContainer" containerID="b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.576956 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tk4h5" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.604422 4830 scope.go:117] "RemoveContainer" containerID="bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.616381 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tk4h5"] Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.636939 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tk4h5"] Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.640772 4830 scope.go:117] "RemoveContainer" containerID="221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.685725 4830 scope.go:117] "RemoveContainer" containerID="b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed" Mar 11 10:01:25 crc kubenswrapper[4830]: E0311 10:01:25.686273 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed\": container with ID starting with b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed not found: ID does not exist" containerID="b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.686328 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed"} err="failed to get container status \"b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed\": rpc error: code = NotFound desc = could not find container \"b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed\": container with ID starting with b152a392444695f282c24a1b7b60496f9d76b63ea646a07d4c69913d0d6b0bed not 
found: ID does not exist" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.686361 4830 scope.go:117] "RemoveContainer" containerID="bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df" Mar 11 10:01:25 crc kubenswrapper[4830]: E0311 10:01:25.686724 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df\": container with ID starting with bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df not found: ID does not exist" containerID="bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.686821 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df"} err="failed to get container status \"bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df\": rpc error: code = NotFound desc = could not find container \"bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df\": container with ID starting with bee1f75713854c99f7953bff9c91adf68353e22430e73cc3554e9e9804aac1df not found: ID does not exist" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.686901 4830 scope.go:117] "RemoveContainer" containerID="221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4" Mar 11 10:01:25 crc kubenswrapper[4830]: E0311 10:01:25.689529 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4\": container with ID starting with 221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4 not found: ID does not exist" containerID="221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4" Mar 11 10:01:25 crc kubenswrapper[4830]: I0311 10:01:25.689559 4830 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4"} err="failed to get container status \"221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4\": rpc error: code = NotFound desc = could not find container \"221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4\": container with ID starting with 221a6c006f8498dad4fc2ec4ff4fe694ea7ac1c8e012883f2bec6033f60b20a4 not found: ID does not exist" Mar 11 10:01:26 crc kubenswrapper[4830]: I0311 10:01:26.944935 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" path="/var/lib/kubelet/pods/aaca749f-8f01-4e4b-8995-aa703a68324f/volumes" Mar 11 10:01:27 crc kubenswrapper[4830]: I0311 10:01:27.066551 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ppl6c" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="registry-server" probeResult="failure" output=< Mar 11 10:01:27 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 10:01:27 crc kubenswrapper[4830]: > Mar 11 10:01:36 crc kubenswrapper[4830]: I0311 10:01:36.076855 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:01:36 crc kubenswrapper[4830]: I0311 10:01:36.130621 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:01:36 crc kubenswrapper[4830]: I0311 10:01:36.317167 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppl6c"] Mar 11 10:01:37 crc kubenswrapper[4830]: I0311 10:01:37.682189 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppl6c" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" 
containerName="registry-server" containerID="cri-o://8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d" gracePeriod=2 Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.301932 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.410140 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-catalog-content\") pod \"a3cc1d21-68c5-4820-a577-0e773ffc084b\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.410342 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-utilities\") pod \"a3cc1d21-68c5-4820-a577-0e773ffc084b\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.410389 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7bl7\" (UniqueName: \"kubernetes.io/projected/a3cc1d21-68c5-4820-a577-0e773ffc084b-kube-api-access-c7bl7\") pod \"a3cc1d21-68c5-4820-a577-0e773ffc084b\" (UID: \"a3cc1d21-68c5-4820-a577-0e773ffc084b\") " Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.411314 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-utilities" (OuterVolumeSpecName: "utilities") pod "a3cc1d21-68c5-4820-a577-0e773ffc084b" (UID: "a3cc1d21-68c5-4820-a577-0e773ffc084b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.417859 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3cc1d21-68c5-4820-a577-0e773ffc084b-kube-api-access-c7bl7" (OuterVolumeSpecName: "kube-api-access-c7bl7") pod "a3cc1d21-68c5-4820-a577-0e773ffc084b" (UID: "a3cc1d21-68c5-4820-a577-0e773ffc084b"). InnerVolumeSpecName "kube-api-access-c7bl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.514365 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.514426 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7bl7\" (UniqueName: \"kubernetes.io/projected/a3cc1d21-68c5-4820-a577-0e773ffc084b-kube-api-access-c7bl7\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.557180 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3cc1d21-68c5-4820-a577-0e773ffc084b" (UID: "a3cc1d21-68c5-4820-a577-0e773ffc084b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.616165 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3cc1d21-68c5-4820-a577-0e773ffc084b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.694700 4830 generic.go:334] "Generic (PLEG): container finished" podID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerID="8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d" exitCode=0 Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.694753 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppl6c" event={"ID":"a3cc1d21-68c5-4820-a577-0e773ffc084b","Type":"ContainerDied","Data":"8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d"} Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.694794 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppl6c" event={"ID":"a3cc1d21-68c5-4820-a577-0e773ffc084b","Type":"ContainerDied","Data":"9eb115f6f417b110042ba5bcbebb6fa2f3d61329d615c5f307020de7edde6017"} Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.694824 4830 scope.go:117] "RemoveContainer" containerID="8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.695833 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppl6c" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.724126 4830 scope.go:117] "RemoveContainer" containerID="c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.739168 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppl6c"] Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.750474 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppl6c"] Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.762143 4830 scope.go:117] "RemoveContainer" containerID="46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.790667 4830 scope.go:117] "RemoveContainer" containerID="8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d" Mar 11 10:01:38 crc kubenswrapper[4830]: E0311 10:01:38.791510 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d\": container with ID starting with 8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d not found: ID does not exist" containerID="8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.791549 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d"} err="failed to get container status \"8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d\": rpc error: code = NotFound desc = could not find container \"8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d\": container with ID starting with 8d2a97b54b00afd686862840d457cfaa8d29b02d1cf85e837aebb6c1fb53626d not found: ID does 
not exist" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.791576 4830 scope.go:117] "RemoveContainer" containerID="c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066" Mar 11 10:01:38 crc kubenswrapper[4830]: E0311 10:01:38.791934 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066\": container with ID starting with c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066 not found: ID does not exist" containerID="c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.791964 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066"} err="failed to get container status \"c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066\": rpc error: code = NotFound desc = could not find container \"c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066\": container with ID starting with c0fee7f27029494944064b2bd8ff8d83459ef5bfd21bdeb7c7a7fef862498066 not found: ID does not exist" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.791981 4830 scope.go:117] "RemoveContainer" containerID="46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af" Mar 11 10:01:38 crc kubenswrapper[4830]: E0311 10:01:38.792384 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af\": container with ID starting with 46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af not found: ID does not exist" containerID="46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.792411 4830 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af"} err="failed to get container status \"46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af\": rpc error: code = NotFound desc = could not find container \"46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af\": container with ID starting with 46ef517c9db75f20fe05f0316e0daa602807f33723bbc60b3791cf905c62e4af not found: ID does not exist" Mar 11 10:01:38 crc kubenswrapper[4830]: I0311 10:01:38.943464 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" path="/var/lib/kubelet/pods/a3cc1d21-68c5-4820-a577-0e773ffc084b/volumes" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.139458 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553722-s8mqq"] Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140377 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ab5666-8c5d-4e96-9c47-502bdc63bafb" containerName="keystone-cron" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140395 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ab5666-8c5d-4e96-9c47-502bdc63bafb" containerName="keystone-cron" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140418 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="extract-content" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140428 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="extract-content" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140446 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="extract-content" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140453 4830 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="extract-content" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140465 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="extract-utilities" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140471 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="extract-utilities" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140490 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="extract-utilities" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140497 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="extract-utilities" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140521 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140529 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140544 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140551 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140563 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140570 4830 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140581 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="extract-content" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140588 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="extract-content" Mar 11 10:02:00 crc kubenswrapper[4830]: E0311 10:02:00.140605 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="extract-utilities" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140612 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="extract-utilities" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140849 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaca749f-8f01-4e4b-8995-aa703a68324f" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140866 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ab5666-8c5d-4e96-9c47-502bdc63bafb" containerName="keystone-cron" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140882 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cc1d21-68c5-4820-a577-0e773ffc084b" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.140893 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e65ad73-b85e-46fd-bf92-533d92bb2c25" containerName="registry-server" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.141718 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553722-s8mqq" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.145189 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.145484 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.145638 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.153962 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553722-s8mqq"] Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.281815 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvwr7\" (UniqueName: \"kubernetes.io/projected/b2151157-cd28-49f2-8784-2ad2ade83346-kube-api-access-nvwr7\") pod \"auto-csr-approver-29553722-s8mqq\" (UID: \"b2151157-cd28-49f2-8784-2ad2ade83346\") " pod="openshift-infra/auto-csr-approver-29553722-s8mqq" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.408720 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvwr7\" (UniqueName: \"kubernetes.io/projected/b2151157-cd28-49f2-8784-2ad2ade83346-kube-api-access-nvwr7\") pod \"auto-csr-approver-29553722-s8mqq\" (UID: \"b2151157-cd28-49f2-8784-2ad2ade83346\") " pod="openshift-infra/auto-csr-approver-29553722-s8mqq" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.432473 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvwr7\" (UniqueName: \"kubernetes.io/projected/b2151157-cd28-49f2-8784-2ad2ade83346-kube-api-access-nvwr7\") pod \"auto-csr-approver-29553722-s8mqq\" (UID: \"b2151157-cd28-49f2-8784-2ad2ade83346\") " 
pod="openshift-infra/auto-csr-approver-29553722-s8mqq" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.466442 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553722-s8mqq" Mar 11 10:02:00 crc kubenswrapper[4830]: I0311 10:02:00.916521 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553722-s8mqq"] Mar 11 10:02:01 crc kubenswrapper[4830]: I0311 10:02:01.892971 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553722-s8mqq" event={"ID":"b2151157-cd28-49f2-8784-2ad2ade83346","Type":"ContainerStarted","Data":"06498c46475ca2bc1e74e3cd196201a91a60659296984d812107e0fc7fc38f4e"} Mar 11 10:02:02 crc kubenswrapper[4830]: I0311 10:02:02.903124 4830 generic.go:334] "Generic (PLEG): container finished" podID="b2151157-cd28-49f2-8784-2ad2ade83346" containerID="503f5696e2f4601293b08e3a83e18da5a99e4d3652c8daae431d749961cfa321" exitCode=0 Mar 11 10:02:02 crc kubenswrapper[4830]: I0311 10:02:02.903216 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553722-s8mqq" event={"ID":"b2151157-cd28-49f2-8784-2ad2ade83346","Type":"ContainerDied","Data":"503f5696e2f4601293b08e3a83e18da5a99e4d3652c8daae431d749961cfa321"} Mar 11 10:02:04 crc kubenswrapper[4830]: I0311 10:02:04.302932 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553722-s8mqq" Mar 11 10:02:04 crc kubenswrapper[4830]: I0311 10:02:04.483232 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvwr7\" (UniqueName: \"kubernetes.io/projected/b2151157-cd28-49f2-8784-2ad2ade83346-kube-api-access-nvwr7\") pod \"b2151157-cd28-49f2-8784-2ad2ade83346\" (UID: \"b2151157-cd28-49f2-8784-2ad2ade83346\") " Mar 11 10:02:04 crc kubenswrapper[4830]: I0311 10:02:04.490150 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2151157-cd28-49f2-8784-2ad2ade83346-kube-api-access-nvwr7" (OuterVolumeSpecName: "kube-api-access-nvwr7") pod "b2151157-cd28-49f2-8784-2ad2ade83346" (UID: "b2151157-cd28-49f2-8784-2ad2ade83346"). InnerVolumeSpecName "kube-api-access-nvwr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:02:04 crc kubenswrapper[4830]: I0311 10:02:04.586210 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvwr7\" (UniqueName: \"kubernetes.io/projected/b2151157-cd28-49f2-8784-2ad2ade83346-kube-api-access-nvwr7\") on node \"crc\" DevicePath \"\"" Mar 11 10:02:04 crc kubenswrapper[4830]: I0311 10:02:04.920539 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553722-s8mqq" event={"ID":"b2151157-cd28-49f2-8784-2ad2ade83346","Type":"ContainerDied","Data":"06498c46475ca2bc1e74e3cd196201a91a60659296984d812107e0fc7fc38f4e"} Mar 11 10:02:04 crc kubenswrapper[4830]: I0311 10:02:04.920576 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06498c46475ca2bc1e74e3cd196201a91a60659296984d812107e0fc7fc38f4e" Mar 11 10:02:04 crc kubenswrapper[4830]: I0311 10:02:04.920841 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553722-s8mqq" Mar 11 10:02:05 crc kubenswrapper[4830]: I0311 10:02:05.368206 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553716-95pb2"] Mar 11 10:02:05 crc kubenswrapper[4830]: I0311 10:02:05.381792 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553716-95pb2"] Mar 11 10:02:06 crc kubenswrapper[4830]: I0311 10:02:06.942646 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5844a434-4c88-45cb-83a1-8c82552f6fe0" path="/var/lib/kubelet/pods/5844a434-4c88-45cb-83a1-8c82552f6fe0/volumes" Mar 11 10:02:12 crc kubenswrapper[4830]: I0311 10:02:12.976165 4830 scope.go:117] "RemoveContainer" containerID="d6da0d0af3409c3fe80c4cef47396543558c833dc514a779d911e188e53f83a2" Mar 11 10:02:13 crc kubenswrapper[4830]: I0311 10:02:13.060836 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:02:13 crc kubenswrapper[4830]: I0311 10:02:13.060890 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:02:43 crc kubenswrapper[4830]: I0311 10:02:43.060484 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:02:43 crc kubenswrapper[4830]: 
I0311 10:02:43.061695 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.060349 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.060781 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.060824 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.061560 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.061614 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" 
containerName="machine-config-daemon" containerID="cri-o://2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" gracePeriod=600 Mar 11 10:03:13 crc kubenswrapper[4830]: E0311 10:03:13.224934 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.561031 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" exitCode=0 Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.561073 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09"} Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.561103 4830 scope.go:117] "RemoveContainer" containerID="65b35d2c5b201abecd063ae3cba9aa95e395340f3759873b180ee4836ac11e36" Mar 11 10:03:13 crc kubenswrapper[4830]: I0311 10:03:13.561800 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:03:13 crc kubenswrapper[4830]: E0311 10:03:13.562248 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:03:26 crc kubenswrapper[4830]: I0311 10:03:26.932731 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:03:26 crc kubenswrapper[4830]: E0311 10:03:26.933699 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:03:40 crc kubenswrapper[4830]: I0311 10:03:40.933397 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:03:40 crc kubenswrapper[4830]: E0311 10:03:40.934426 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:03:51 crc kubenswrapper[4830]: I0311 10:03:51.932588 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:03:51 crc kubenswrapper[4830]: E0311 10:03:51.933443 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.146243 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553724-p2vbt"] Mar 11 10:04:00 crc kubenswrapper[4830]: E0311 10:04:00.147194 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2151157-cd28-49f2-8784-2ad2ade83346" containerName="oc" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.147208 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2151157-cd28-49f2-8784-2ad2ade83346" containerName="oc" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.147445 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2151157-cd28-49f2-8784-2ad2ade83346" containerName="oc" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.148130 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553724-p2vbt" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.150174 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.150726 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.151483 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.162087 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553724-p2vbt"] Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.200262 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtmnl\" (UniqueName: \"kubernetes.io/projected/e49eec21-b49a-4a1e-ac4d-e25faf688a65-kube-api-access-gtmnl\") pod \"auto-csr-approver-29553724-p2vbt\" (UID: \"e49eec21-b49a-4a1e-ac4d-e25faf688a65\") " pod="openshift-infra/auto-csr-approver-29553724-p2vbt" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.302478 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtmnl\" (UniqueName: \"kubernetes.io/projected/e49eec21-b49a-4a1e-ac4d-e25faf688a65-kube-api-access-gtmnl\") pod \"auto-csr-approver-29553724-p2vbt\" (UID: \"e49eec21-b49a-4a1e-ac4d-e25faf688a65\") " pod="openshift-infra/auto-csr-approver-29553724-p2vbt" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.324786 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtmnl\" (UniqueName: \"kubernetes.io/projected/e49eec21-b49a-4a1e-ac4d-e25faf688a65-kube-api-access-gtmnl\") pod \"auto-csr-approver-29553724-p2vbt\" (UID: \"e49eec21-b49a-4a1e-ac4d-e25faf688a65\") " 
pod="openshift-infra/auto-csr-approver-29553724-p2vbt" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.471868 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553724-p2vbt" Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.951777 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553724-p2vbt"] Mar 11 10:04:00 crc kubenswrapper[4830]: I0311 10:04:00.971328 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553724-p2vbt" event={"ID":"e49eec21-b49a-4a1e-ac4d-e25faf688a65","Type":"ContainerStarted","Data":"71d47db13864e108d0c890de8300c148f4f0d97462c851f94f440be1ef20366f"} Mar 11 10:04:02 crc kubenswrapper[4830]: I0311 10:04:02.940170 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:04:02 crc kubenswrapper[4830]: E0311 10:04:02.940977 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:04:02 crc kubenswrapper[4830]: I0311 10:04:02.991429 4830 generic.go:334] "Generic (PLEG): container finished" podID="e49eec21-b49a-4a1e-ac4d-e25faf688a65" containerID="108ecbc98b84e43defee76b55254dede7601a789955fb2038ce6fd34fbbb3489" exitCode=0 Mar 11 10:04:02 crc kubenswrapper[4830]: I0311 10:04:02.991478 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553724-p2vbt" event={"ID":"e49eec21-b49a-4a1e-ac4d-e25faf688a65","Type":"ContainerDied","Data":"108ecbc98b84e43defee76b55254dede7601a789955fb2038ce6fd34fbbb3489"} 
Mar 11 10:04:04 crc kubenswrapper[4830]: I0311 10:04:04.408732 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553724-p2vbt" Mar 11 10:04:04 crc kubenswrapper[4830]: I0311 10:04:04.483139 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtmnl\" (UniqueName: \"kubernetes.io/projected/e49eec21-b49a-4a1e-ac4d-e25faf688a65-kube-api-access-gtmnl\") pod \"e49eec21-b49a-4a1e-ac4d-e25faf688a65\" (UID: \"e49eec21-b49a-4a1e-ac4d-e25faf688a65\") " Mar 11 10:04:04 crc kubenswrapper[4830]: I0311 10:04:04.489635 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49eec21-b49a-4a1e-ac4d-e25faf688a65-kube-api-access-gtmnl" (OuterVolumeSpecName: "kube-api-access-gtmnl") pod "e49eec21-b49a-4a1e-ac4d-e25faf688a65" (UID: "e49eec21-b49a-4a1e-ac4d-e25faf688a65"). InnerVolumeSpecName "kube-api-access-gtmnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:04:04 crc kubenswrapper[4830]: I0311 10:04:04.585897 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtmnl\" (UniqueName: \"kubernetes.io/projected/e49eec21-b49a-4a1e-ac4d-e25faf688a65-kube-api-access-gtmnl\") on node \"crc\" DevicePath \"\"" Mar 11 10:04:05 crc kubenswrapper[4830]: I0311 10:04:05.008848 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553724-p2vbt" event={"ID":"e49eec21-b49a-4a1e-ac4d-e25faf688a65","Type":"ContainerDied","Data":"71d47db13864e108d0c890de8300c148f4f0d97462c851f94f440be1ef20366f"} Mar 11 10:04:05 crc kubenswrapper[4830]: I0311 10:04:05.008894 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71d47db13864e108d0c890de8300c148f4f0d97462c851f94f440be1ef20366f" Mar 11 10:04:05 crc kubenswrapper[4830]: I0311 10:04:05.008944 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553724-p2vbt" Mar 11 10:04:05 crc kubenswrapper[4830]: I0311 10:04:05.469209 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553718-5wskx"] Mar 11 10:04:05 crc kubenswrapper[4830]: I0311 10:04:05.477074 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553718-5wskx"] Mar 11 10:04:06 crc kubenswrapper[4830]: I0311 10:04:06.944800 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e97f930-5cd6-46d9-a168-f1a3b4f98936" path="/var/lib/kubelet/pods/2e97f930-5cd6-46d9-a168-f1a3b4f98936/volumes" Mar 11 10:04:13 crc kubenswrapper[4830]: I0311 10:04:13.119114 4830 scope.go:117] "RemoveContainer" containerID="90ca592dd5ee7483487dfd489dcde7f0a8aa3ef1a2f95a8b93402fc5f3fb792c" Mar 11 10:04:16 crc kubenswrapper[4830]: I0311 10:04:16.933315 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:04:16 crc kubenswrapper[4830]: E0311 10:04:16.934370 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:04:30 crc kubenswrapper[4830]: I0311 10:04:30.932691 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:04:30 crc kubenswrapper[4830]: E0311 10:04:30.933567 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:04:42 crc kubenswrapper[4830]: I0311 10:04:42.947770 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:04:42 crc kubenswrapper[4830]: E0311 10:04:42.949174 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:04:55 crc kubenswrapper[4830]: I0311 10:04:55.932865 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:04:55 crc kubenswrapper[4830]: E0311 10:04:55.933555 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:05:09 crc kubenswrapper[4830]: I0311 10:05:09.932418 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:05:09 crc kubenswrapper[4830]: E0311 10:05:09.933230 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.575899 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dd7mz"] Mar 11 10:05:20 crc kubenswrapper[4830]: E0311 10:05:20.577094 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49eec21-b49a-4a1e-ac4d-e25faf688a65" containerName="oc" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.577111 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49eec21-b49a-4a1e-ac4d-e25faf688a65" containerName="oc" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.577341 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49eec21-b49a-4a1e-ac4d-e25faf688a65" containerName="oc" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.579066 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.586740 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dd7mz"] Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.721651 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-catalog-content\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.722141 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-utilities\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.722479 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmss\" (UniqueName: \"kubernetes.io/projected/fc00cff5-6519-4117-bd75-ab0a46a543d3-kube-api-access-ftmss\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.825463 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-utilities\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.826062 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ftmss\" (UniqueName: \"kubernetes.io/projected/fc00cff5-6519-4117-bd75-ab0a46a543d3-kube-api-access-ftmss\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.826233 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-utilities\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.826461 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-catalog-content\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.827152 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-catalog-content\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.861310 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftmss\" (UniqueName: \"kubernetes.io/projected/fc00cff5-6519-4117-bd75-ab0a46a543d3-kube-api-access-ftmss\") pod \"community-operators-dd7mz\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:20 crc kubenswrapper[4830]: I0311 10:05:20.905551 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:21 crc kubenswrapper[4830]: I0311 10:05:21.482544 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dd7mz"] Mar 11 10:05:21 crc kubenswrapper[4830]: I0311 10:05:21.686900 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7mz" event={"ID":"fc00cff5-6519-4117-bd75-ab0a46a543d3","Type":"ContainerStarted","Data":"d030ea147e53d99fe81ea689ea3fa96844ea6fb6c06dded03c3851ccaecde47c"} Mar 11 10:05:21 crc kubenswrapper[4830]: I0311 10:05:21.933423 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:05:21 crc kubenswrapper[4830]: E0311 10:05:21.933678 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:05:22 crc kubenswrapper[4830]: I0311 10:05:22.696613 4830 generic.go:334] "Generic (PLEG): container finished" podID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerID="675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef" exitCode=0 Mar 11 10:05:22 crc kubenswrapper[4830]: I0311 10:05:22.696717 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7mz" event={"ID":"fc00cff5-6519-4117-bd75-ab0a46a543d3","Type":"ContainerDied","Data":"675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef"} Mar 11 10:05:24 crc kubenswrapper[4830]: I0311 10:05:24.716777 4830 generic.go:334] "Generic (PLEG): container finished" podID="fc00cff5-6519-4117-bd75-ab0a46a543d3" 
containerID="b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2" exitCode=0 Mar 11 10:05:24 crc kubenswrapper[4830]: I0311 10:05:24.716883 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7mz" event={"ID":"fc00cff5-6519-4117-bd75-ab0a46a543d3","Type":"ContainerDied","Data":"b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2"} Mar 11 10:05:25 crc kubenswrapper[4830]: I0311 10:05:25.727034 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7mz" event={"ID":"fc00cff5-6519-4117-bd75-ab0a46a543d3","Type":"ContainerStarted","Data":"f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a"} Mar 11 10:05:25 crc kubenswrapper[4830]: I0311 10:05:25.753830 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dd7mz" podStartSLOduration=3.161886222 podStartE2EDuration="5.753809998s" podCreationTimestamp="2026-03-11 10:05:20 +0000 UTC" firstStartedPulling="2026-03-11 10:05:22.69919836 +0000 UTC m=+3090.480349049" lastFinishedPulling="2026-03-11 10:05:25.291122136 +0000 UTC m=+3093.072272825" observedRunningTime="2026-03-11 10:05:25.747308602 +0000 UTC m=+3093.528459301" watchObservedRunningTime="2026-03-11 10:05:25.753809998 +0000 UTC m=+3093.534960687" Mar 11 10:05:30 crc kubenswrapper[4830]: I0311 10:05:30.906007 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:30 crc kubenswrapper[4830]: I0311 10:05:30.906599 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:30 crc kubenswrapper[4830]: I0311 10:05:30.954380 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:31 crc kubenswrapper[4830]: I0311 
10:05:31.824377 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:31 crc kubenswrapper[4830]: I0311 10:05:31.874729 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dd7mz"] Mar 11 10:05:33 crc kubenswrapper[4830]: I0311 10:05:33.794692 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dd7mz" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerName="registry-server" containerID="cri-o://f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a" gracePeriod=2 Mar 11 10:05:33 crc kubenswrapper[4830]: I0311 10:05:33.932644 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:05:33 crc kubenswrapper[4830]: E0311 10:05:33.933397 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.336922 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.529141 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-catalog-content\") pod \"fc00cff5-6519-4117-bd75-ab0a46a543d3\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.529395 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-utilities\") pod \"fc00cff5-6519-4117-bd75-ab0a46a543d3\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.529528 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftmss\" (UniqueName: \"kubernetes.io/projected/fc00cff5-6519-4117-bd75-ab0a46a543d3-kube-api-access-ftmss\") pod \"fc00cff5-6519-4117-bd75-ab0a46a543d3\" (UID: \"fc00cff5-6519-4117-bd75-ab0a46a543d3\") " Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.530243 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-utilities" (OuterVolumeSpecName: "utilities") pod "fc00cff5-6519-4117-bd75-ab0a46a543d3" (UID: "fc00cff5-6519-4117-bd75-ab0a46a543d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.534942 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc00cff5-6519-4117-bd75-ab0a46a543d3-kube-api-access-ftmss" (OuterVolumeSpecName: "kube-api-access-ftmss") pod "fc00cff5-6519-4117-bd75-ab0a46a543d3" (UID: "fc00cff5-6519-4117-bd75-ab0a46a543d3"). InnerVolumeSpecName "kube-api-access-ftmss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.587830 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc00cff5-6519-4117-bd75-ab0a46a543d3" (UID: "fc00cff5-6519-4117-bd75-ab0a46a543d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.631615 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.631654 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc00cff5-6519-4117-bd75-ab0a46a543d3-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.631663 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftmss\" (UniqueName: \"kubernetes.io/projected/fc00cff5-6519-4117-bd75-ab0a46a543d3-kube-api-access-ftmss\") on node \"crc\" DevicePath \"\"" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.804715 4830 generic.go:334] "Generic (PLEG): container finished" podID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerID="f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a" exitCode=0 Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.804768 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dd7mz" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.804790 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7mz" event={"ID":"fc00cff5-6519-4117-bd75-ab0a46a543d3","Type":"ContainerDied","Data":"f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a"} Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.806095 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7mz" event={"ID":"fc00cff5-6519-4117-bd75-ab0a46a543d3","Type":"ContainerDied","Data":"d030ea147e53d99fe81ea689ea3fa96844ea6fb6c06dded03c3851ccaecde47c"} Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.806124 4830 scope.go:117] "RemoveContainer" containerID="f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.833926 4830 scope.go:117] "RemoveContainer" containerID="b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.838932 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dd7mz"] Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.861929 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dd7mz"] Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.870415 4830 scope.go:117] "RemoveContainer" containerID="675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.898590 4830 scope.go:117] "RemoveContainer" containerID="f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a" Mar 11 10:05:34 crc kubenswrapper[4830]: E0311 10:05:34.899219 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a\": container with ID starting with f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a not found: ID does not exist" containerID="f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.899274 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a"} err="failed to get container status \"f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a\": rpc error: code = NotFound desc = could not find container \"f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a\": container with ID starting with f98a2e4fe547926a8e9d0f5d87895bd4b93479efcf95b49eb3fd5143cc28f44a not found: ID does not exist" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.899298 4830 scope.go:117] "RemoveContainer" containerID="b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2" Mar 11 10:05:34 crc kubenswrapper[4830]: E0311 10:05:34.899641 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2\": container with ID starting with b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2 not found: ID does not exist" containerID="b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.899684 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2"} err="failed to get container status \"b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2\": rpc error: code = NotFound desc = could not find container \"b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2\": container with ID 
starting with b08298a51734ccbf2e7c9087d781b3590b2791f6dbe7b82315538dd28744a3a2 not found: ID does not exist" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.899699 4830 scope.go:117] "RemoveContainer" containerID="675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef" Mar 11 10:05:34 crc kubenswrapper[4830]: E0311 10:05:34.899953 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef\": container with ID starting with 675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef not found: ID does not exist" containerID="675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.899974 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef"} err="failed to get container status \"675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef\": rpc error: code = NotFound desc = could not find container \"675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef\": container with ID starting with 675cff1a264cd2564542dadf1bda8d6ff69e0815b3d7b5bd379e45de392180ef not found: ID does not exist" Mar 11 10:05:34 crc kubenswrapper[4830]: I0311 10:05:34.944382 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" path="/var/lib/kubelet/pods/fc00cff5-6519-4117-bd75-ab0a46a543d3/volumes" Mar 11 10:05:48 crc kubenswrapper[4830]: I0311 10:05:48.932355 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:05:48 crc kubenswrapper[4830]: E0311 10:05:48.933354 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.145168 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553726-d6mz7"] Mar 11 10:06:00 crc kubenswrapper[4830]: E0311 10:06:00.146298 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerName="extract-utilities" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.146316 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerName="extract-utilities" Mar 11 10:06:00 crc kubenswrapper[4830]: E0311 10:06:00.146355 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerName="registry-server" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.146363 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerName="registry-server" Mar 11 10:06:00 crc kubenswrapper[4830]: E0311 10:06:00.146379 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerName="extract-content" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.146391 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerName="extract-content" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.146623 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc00cff5-6519-4117-bd75-ab0a46a543d3" containerName="registry-server" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.147401 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.149713 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.150117 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.150166 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.155744 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553726-d6mz7"] Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.233204 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt8pj\" (UniqueName: \"kubernetes.io/projected/8ce964b2-b5cc-4ba2-b9a4-c2c600feb743-kube-api-access-jt8pj\") pod \"auto-csr-approver-29553726-d6mz7\" (UID: \"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743\") " pod="openshift-infra/auto-csr-approver-29553726-d6mz7" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.334839 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt8pj\" (UniqueName: \"kubernetes.io/projected/8ce964b2-b5cc-4ba2-b9a4-c2c600feb743-kube-api-access-jt8pj\") pod \"auto-csr-approver-29553726-d6mz7\" (UID: \"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743\") " pod="openshift-infra/auto-csr-approver-29553726-d6mz7" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.355984 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt8pj\" (UniqueName: \"kubernetes.io/projected/8ce964b2-b5cc-4ba2-b9a4-c2c600feb743-kube-api-access-jt8pj\") pod \"auto-csr-approver-29553726-d6mz7\" (UID: \"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743\") " 
pod="openshift-infra/auto-csr-approver-29553726-d6mz7" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.475200 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.919061 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553726-d6mz7"] Mar 11 10:06:00 crc kubenswrapper[4830]: I0311 10:06:00.932788 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:06:00 crc kubenswrapper[4830]: E0311 10:06:00.933160 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:06:01 crc kubenswrapper[4830]: I0311 10:06:01.026719 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" event={"ID":"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743","Type":"ContainerStarted","Data":"47337d426880f637c5c5d20e3232ca72f97ac21515647b2a70419ac2e4b2d24d"} Mar 11 10:06:03 crc kubenswrapper[4830]: I0311 10:06:03.044943 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" event={"ID":"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743","Type":"ContainerStarted","Data":"ab2a7aa6993aa7bc747d02ca521d17a55fee5bc963aa7891814bcd687f34c96a"} Mar 11 10:06:03 crc kubenswrapper[4830]: I0311 10:06:03.062671 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" podStartSLOduration=1.402894295 
podStartE2EDuration="3.062630631s" podCreationTimestamp="2026-03-11 10:06:00 +0000 UTC" firstStartedPulling="2026-03-11 10:06:00.916454519 +0000 UTC m=+3128.697605208" lastFinishedPulling="2026-03-11 10:06:02.576190855 +0000 UTC m=+3130.357341544" observedRunningTime="2026-03-11 10:06:03.057578404 +0000 UTC m=+3130.838729103" watchObservedRunningTime="2026-03-11 10:06:03.062630631 +0000 UTC m=+3130.843781320" Mar 11 10:06:04 crc kubenswrapper[4830]: I0311 10:06:04.054465 4830 generic.go:334] "Generic (PLEG): container finished" podID="8ce964b2-b5cc-4ba2-b9a4-c2c600feb743" containerID="ab2a7aa6993aa7bc747d02ca521d17a55fee5bc963aa7891814bcd687f34c96a" exitCode=0 Mar 11 10:06:04 crc kubenswrapper[4830]: I0311 10:06:04.054517 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" event={"ID":"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743","Type":"ContainerDied","Data":"ab2a7aa6993aa7bc747d02ca521d17a55fee5bc963aa7891814bcd687f34c96a"} Mar 11 10:06:05 crc kubenswrapper[4830]: I0311 10:06:05.378121 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" Mar 11 10:06:05 crc kubenswrapper[4830]: I0311 10:06:05.531441 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt8pj\" (UniqueName: \"kubernetes.io/projected/8ce964b2-b5cc-4ba2-b9a4-c2c600feb743-kube-api-access-jt8pj\") pod \"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743\" (UID: \"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743\") " Mar 11 10:06:05 crc kubenswrapper[4830]: I0311 10:06:05.538887 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce964b2-b5cc-4ba2-b9a4-c2c600feb743-kube-api-access-jt8pj" (OuterVolumeSpecName: "kube-api-access-jt8pj") pod "8ce964b2-b5cc-4ba2-b9a4-c2c600feb743" (UID: "8ce964b2-b5cc-4ba2-b9a4-c2c600feb743"). InnerVolumeSpecName "kube-api-access-jt8pj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:06:05 crc kubenswrapper[4830]: I0311 10:06:05.633390 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt8pj\" (UniqueName: \"kubernetes.io/projected/8ce964b2-b5cc-4ba2-b9a4-c2c600feb743-kube-api-access-jt8pj\") on node \"crc\" DevicePath \"\"" Mar 11 10:06:06 crc kubenswrapper[4830]: I0311 10:06:06.024006 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553720-5dnqn"] Mar 11 10:06:06 crc kubenswrapper[4830]: I0311 10:06:06.032979 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553720-5dnqn"] Mar 11 10:06:06 crc kubenswrapper[4830]: I0311 10:06:06.070973 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" event={"ID":"8ce964b2-b5cc-4ba2-b9a4-c2c600feb743","Type":"ContainerDied","Data":"47337d426880f637c5c5d20e3232ca72f97ac21515647b2a70419ac2e4b2d24d"} Mar 11 10:06:06 crc kubenswrapper[4830]: I0311 10:06:06.071043 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47337d426880f637c5c5d20e3232ca72f97ac21515647b2a70419ac2e4b2d24d" Mar 11 10:06:06 crc kubenswrapper[4830]: I0311 10:06:06.071100 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553726-d6mz7" Mar 11 10:06:06 crc kubenswrapper[4830]: I0311 10:06:06.944402 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1fe71e-7b7f-439e-9d8f-68743095b1a7" path="/var/lib/kubelet/pods/7f1fe71e-7b7f-439e-9d8f-68743095b1a7/volumes" Mar 11 10:06:13 crc kubenswrapper[4830]: I0311 10:06:13.232667 4830 scope.go:117] "RemoveContainer" containerID="b99227ff704a4d01d8d2bac7f81921f727f758b3aa979d157354dff1cc0c27ac" Mar 11 10:06:13 crc kubenswrapper[4830]: I0311 10:06:13.932471 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:06:13 crc kubenswrapper[4830]: E0311 10:06:13.933050 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:06:26 crc kubenswrapper[4830]: I0311 10:06:26.932905 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:06:26 crc kubenswrapper[4830]: E0311 10:06:26.934107 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:06:37 crc kubenswrapper[4830]: I0311 10:06:37.933130 4830 scope.go:117] "RemoveContainer" 
containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:06:37 crc kubenswrapper[4830]: E0311 10:06:37.934069 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:06:52 crc kubenswrapper[4830]: I0311 10:06:52.941779 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:06:52 crc kubenswrapper[4830]: E0311 10:06:52.942586 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:07:07 crc kubenswrapper[4830]: I0311 10:07:07.932850 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:07:07 crc kubenswrapper[4830]: E0311 10:07:07.933574 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:07:19 crc kubenswrapper[4830]: I0311 10:07:19.932230 4830 scope.go:117] 
"RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:07:19 crc kubenswrapper[4830]: E0311 10:07:19.933129 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:07:34 crc kubenswrapper[4830]: I0311 10:07:34.932821 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:07:34 crc kubenswrapper[4830]: E0311 10:07:34.933599 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:07:45 crc kubenswrapper[4830]: I0311 10:07:45.932909 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:07:45 crc kubenswrapper[4830]: E0311 10:07:45.933832 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:07:58 crc kubenswrapper[4830]: I0311 10:07:58.937295 
4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:07:58 crc kubenswrapper[4830]: E0311 10:07:58.938075 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.179502 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553728-v7dwh"] Mar 11 10:08:00 crc kubenswrapper[4830]: E0311 10:08:00.179976 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce964b2-b5cc-4ba2-b9a4-c2c600feb743" containerName="oc" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.179995 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce964b2-b5cc-4ba2-b9a4-c2c600feb743" containerName="oc" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.180233 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ce964b2-b5cc-4ba2-b9a4-c2c600feb743" containerName="oc" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.180939 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553728-v7dwh" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.184796 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.187229 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.189039 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.189895 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553728-v7dwh"] Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.331180 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlwg\" (UniqueName: \"kubernetes.io/projected/47589ba4-9dbf-430b-b1f2-c1507faeab74-kube-api-access-mmlwg\") pod \"auto-csr-approver-29553728-v7dwh\" (UID: \"47589ba4-9dbf-430b-b1f2-c1507faeab74\") " pod="openshift-infra/auto-csr-approver-29553728-v7dwh" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.433482 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlwg\" (UniqueName: \"kubernetes.io/projected/47589ba4-9dbf-430b-b1f2-c1507faeab74-kube-api-access-mmlwg\") pod \"auto-csr-approver-29553728-v7dwh\" (UID: \"47589ba4-9dbf-430b-b1f2-c1507faeab74\") " pod="openshift-infra/auto-csr-approver-29553728-v7dwh" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.455967 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlwg\" (UniqueName: \"kubernetes.io/projected/47589ba4-9dbf-430b-b1f2-c1507faeab74-kube-api-access-mmlwg\") pod \"auto-csr-approver-29553728-v7dwh\" (UID: \"47589ba4-9dbf-430b-b1f2-c1507faeab74\") " 
pod="openshift-infra/auto-csr-approver-29553728-v7dwh" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.504534 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553728-v7dwh" Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.935057 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 10:08:00 crc kubenswrapper[4830]: I0311 10:08:00.951623 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553728-v7dwh"] Mar 11 10:08:01 crc kubenswrapper[4830]: I0311 10:08:01.046407 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553728-v7dwh" event={"ID":"47589ba4-9dbf-430b-b1f2-c1507faeab74","Type":"ContainerStarted","Data":"78e153fa1d3cfec5f2a6b93fe4da5770408e3507977b147518469996855a2150"} Mar 11 10:08:03 crc kubenswrapper[4830]: I0311 10:08:03.082905 4830 generic.go:334] "Generic (PLEG): container finished" podID="47589ba4-9dbf-430b-b1f2-c1507faeab74" containerID="f250a6e273286fdb98ef1e5df4c3044cb79a3d7b4e7b3a11a1a566ed785cd1d1" exitCode=0 Mar 11 10:08:03 crc kubenswrapper[4830]: I0311 10:08:03.083144 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553728-v7dwh" event={"ID":"47589ba4-9dbf-430b-b1f2-c1507faeab74","Type":"ContainerDied","Data":"f250a6e273286fdb98ef1e5df4c3044cb79a3d7b4e7b3a11a1a566ed785cd1d1"} Mar 11 10:08:04 crc kubenswrapper[4830]: I0311 10:08:04.524438 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553728-v7dwh" Mar 11 10:08:04 crc kubenswrapper[4830]: I0311 10:08:04.619484 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmlwg\" (UniqueName: \"kubernetes.io/projected/47589ba4-9dbf-430b-b1f2-c1507faeab74-kube-api-access-mmlwg\") pod \"47589ba4-9dbf-430b-b1f2-c1507faeab74\" (UID: \"47589ba4-9dbf-430b-b1f2-c1507faeab74\") " Mar 11 10:08:04 crc kubenswrapper[4830]: I0311 10:08:04.625295 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47589ba4-9dbf-430b-b1f2-c1507faeab74-kube-api-access-mmlwg" (OuterVolumeSpecName: "kube-api-access-mmlwg") pod "47589ba4-9dbf-430b-b1f2-c1507faeab74" (UID: "47589ba4-9dbf-430b-b1f2-c1507faeab74"). InnerVolumeSpecName "kube-api-access-mmlwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:08:04 crc kubenswrapper[4830]: I0311 10:08:04.721951 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmlwg\" (UniqueName: \"kubernetes.io/projected/47589ba4-9dbf-430b-b1f2-c1507faeab74-kube-api-access-mmlwg\") on node \"crc\" DevicePath \"\"" Mar 11 10:08:05 crc kubenswrapper[4830]: I0311 10:08:05.104332 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553728-v7dwh" event={"ID":"47589ba4-9dbf-430b-b1f2-c1507faeab74","Type":"ContainerDied","Data":"78e153fa1d3cfec5f2a6b93fe4da5770408e3507977b147518469996855a2150"} Mar 11 10:08:05 crc kubenswrapper[4830]: I0311 10:08:05.104377 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553728-v7dwh" Mar 11 10:08:05 crc kubenswrapper[4830]: I0311 10:08:05.104378 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78e153fa1d3cfec5f2a6b93fe4da5770408e3507977b147518469996855a2150" Mar 11 10:08:05 crc kubenswrapper[4830]: I0311 10:08:05.586838 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553722-s8mqq"] Mar 11 10:08:05 crc kubenswrapper[4830]: I0311 10:08:05.594600 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553722-s8mqq"] Mar 11 10:08:06 crc kubenswrapper[4830]: I0311 10:08:06.944629 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2151157-cd28-49f2-8784-2ad2ade83346" path="/var/lib/kubelet/pods/b2151157-cd28-49f2-8784-2ad2ade83346/volumes" Mar 11 10:08:13 crc kubenswrapper[4830]: I0311 10:08:13.337429 4830 scope.go:117] "RemoveContainer" containerID="503f5696e2f4601293b08e3a83e18da5a99e4d3652c8daae431d749961cfa321" Mar 11 10:08:13 crc kubenswrapper[4830]: I0311 10:08:13.932992 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:08:14 crc kubenswrapper[4830]: I0311 10:08:14.191363 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"39b77e2518e07260183281640d393fb923092c241d46b9fda5255d21c4ba975d"} Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.145287 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553730-r2zzj"] Mar 11 10:10:00 crc kubenswrapper[4830]: E0311 10:10:00.147933 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47589ba4-9dbf-430b-b1f2-c1507faeab74" containerName="oc" Mar 11 10:10:00 crc kubenswrapper[4830]: 
I0311 10:10:00.148078 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="47589ba4-9dbf-430b-b1f2-c1507faeab74" containerName="oc" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.148528 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="47589ba4-9dbf-430b-b1f2-c1507faeab74" containerName="oc" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.149571 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553730-r2zzj" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.151982 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.152827 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.152825 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.155039 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553730-r2zzj"] Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.291582 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p57dw\" (UniqueName: \"kubernetes.io/projected/2b362a55-c049-48f8-9c1c-ed0f66ef4bc4-kube-api-access-p57dw\") pod \"auto-csr-approver-29553730-r2zzj\" (UID: \"2b362a55-c049-48f8-9c1c-ed0f66ef4bc4\") " pod="openshift-infra/auto-csr-approver-29553730-r2zzj" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.394134 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p57dw\" (UniqueName: \"kubernetes.io/projected/2b362a55-c049-48f8-9c1c-ed0f66ef4bc4-kube-api-access-p57dw\") pod \"auto-csr-approver-29553730-r2zzj\" (UID: 
\"2b362a55-c049-48f8-9c1c-ed0f66ef4bc4\") " pod="openshift-infra/auto-csr-approver-29553730-r2zzj" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.416411 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p57dw\" (UniqueName: \"kubernetes.io/projected/2b362a55-c049-48f8-9c1c-ed0f66ef4bc4-kube-api-access-p57dw\") pod \"auto-csr-approver-29553730-r2zzj\" (UID: \"2b362a55-c049-48f8-9c1c-ed0f66ef4bc4\") " pod="openshift-infra/auto-csr-approver-29553730-r2zzj" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.470677 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553730-r2zzj" Mar 11 10:10:00 crc kubenswrapper[4830]: I0311 10:10:00.906714 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553730-r2zzj"] Mar 11 10:10:01 crc kubenswrapper[4830]: I0311 10:10:01.074781 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553730-r2zzj" event={"ID":"2b362a55-c049-48f8-9c1c-ed0f66ef4bc4","Type":"ContainerStarted","Data":"26294eb4b9b08fda241e51bf0c9df55757671274ddae967c5f58180cc18f11d0"} Mar 11 10:10:04 crc kubenswrapper[4830]: I0311 10:10:04.113902 4830 generic.go:334] "Generic (PLEG): container finished" podID="2b362a55-c049-48f8-9c1c-ed0f66ef4bc4" containerID="bc33086c0384153e4204af1273c8a6fde7849a557a44b575c216d0b2aa047d6d" exitCode=0 Mar 11 10:10:04 crc kubenswrapper[4830]: I0311 10:10:04.114502 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553730-r2zzj" event={"ID":"2b362a55-c049-48f8-9c1c-ed0f66ef4bc4","Type":"ContainerDied","Data":"bc33086c0384153e4204af1273c8a6fde7849a557a44b575c216d0b2aa047d6d"} Mar 11 10:10:05 crc kubenswrapper[4830]: I0311 10:10:05.518429 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553730-r2zzj" Mar 11 10:10:05 crc kubenswrapper[4830]: I0311 10:10:05.590844 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p57dw\" (UniqueName: \"kubernetes.io/projected/2b362a55-c049-48f8-9c1c-ed0f66ef4bc4-kube-api-access-p57dw\") pod \"2b362a55-c049-48f8-9c1c-ed0f66ef4bc4\" (UID: \"2b362a55-c049-48f8-9c1c-ed0f66ef4bc4\") " Mar 11 10:10:05 crc kubenswrapper[4830]: I0311 10:10:05.610987 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b362a55-c049-48f8-9c1c-ed0f66ef4bc4-kube-api-access-p57dw" (OuterVolumeSpecName: "kube-api-access-p57dw") pod "2b362a55-c049-48f8-9c1c-ed0f66ef4bc4" (UID: "2b362a55-c049-48f8-9c1c-ed0f66ef4bc4"). InnerVolumeSpecName "kube-api-access-p57dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:10:05 crc kubenswrapper[4830]: I0311 10:10:05.693571 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p57dw\" (UniqueName: \"kubernetes.io/projected/2b362a55-c049-48f8-9c1c-ed0f66ef4bc4-kube-api-access-p57dw\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:06 crc kubenswrapper[4830]: I0311 10:10:06.131963 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553730-r2zzj" event={"ID":"2b362a55-c049-48f8-9c1c-ed0f66ef4bc4","Type":"ContainerDied","Data":"26294eb4b9b08fda241e51bf0c9df55757671274ddae967c5f58180cc18f11d0"} Mar 11 10:10:06 crc kubenswrapper[4830]: I0311 10:10:06.132710 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26294eb4b9b08fda241e51bf0c9df55757671274ddae967c5f58180cc18f11d0" Mar 11 10:10:06 crc kubenswrapper[4830]: I0311 10:10:06.132028 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553730-r2zzj" Mar 11 10:10:06 crc kubenswrapper[4830]: I0311 10:10:06.588709 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553724-p2vbt"] Mar 11 10:10:06 crc kubenswrapper[4830]: I0311 10:10:06.596855 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553724-p2vbt"] Mar 11 10:10:06 crc kubenswrapper[4830]: I0311 10:10:06.945477 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49eec21-b49a-4a1e-ac4d-e25faf688a65" path="/var/lib/kubelet/pods/e49eec21-b49a-4a1e-ac4d-e25faf688a65/volumes" Mar 11 10:10:13 crc kubenswrapper[4830]: I0311 10:10:13.451927 4830 scope.go:117] "RemoveContainer" containerID="108ecbc98b84e43defee76b55254dede7601a789955fb2038ce6fd34fbbb3489" Mar 11 10:10:29 crc kubenswrapper[4830]: I0311 10:10:29.352756 4830 generic.go:334] "Generic (PLEG): container finished" podID="e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" containerID="89e020fe3c13a5fb6b5d0340eb81add04cdc9a53e94a5dc09f419fd5141d94e7" exitCode=0 Mar 11 10:10:29 crc kubenswrapper[4830]: I0311 10:10:29.352852 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3","Type":"ContainerDied","Data":"89e020fe3c13a5fb6b5d0340eb81add04cdc9a53e94a5dc09f419fd5141d94e7"} Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.741256 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.906967 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config-secret\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.907045 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.907174 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.907237 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-workdir\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.907275 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-config-data\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.907315 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ssh-key\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.907367 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2vl\" (UniqueName: \"kubernetes.io/projected/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-kube-api-access-mq2vl\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.907402 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ca-certs\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.907433 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-temporary\") pod \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\" (UID: \"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3\") " Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.908489 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-config-data" (OuterVolumeSpecName: "config-data") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.908561 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.909171 4830 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.909200 4830 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.914629 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.914879 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-kube-api-access-mq2vl" (OuterVolumeSpecName: "kube-api-access-mq2vl") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). 
InnerVolumeSpecName "kube-api-access-mq2vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.917144 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.938104 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.939570 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.940142 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 11 10:10:30 crc kubenswrapper[4830]: I0311 10:10:30.999722 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" (UID: "e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.011758 4830 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.011802 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.011846 4830 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.011861 4830 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.011876 4830 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.011888 4830 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.011900 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2vl\" (UniqueName: \"kubernetes.io/projected/e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3-kube-api-access-mq2vl\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.035686 4830 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.114009 4830 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.370450 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3","Type":"ContainerDied","Data":"470b1c13ad588d9c089141c5faf08022da684564d2105ff76b7147bfa5cbe7a7"} Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.370994 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="470b1c13ad588d9c089141c5faf08022da684564d2105ff76b7147bfa5cbe7a7" Mar 11 10:10:31 crc kubenswrapper[4830]: I0311 10:10:31.370510 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 11 10:10:42 crc kubenswrapper[4830]: I0311 10:10:42.888241 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 11 10:10:42 crc kubenswrapper[4830]: E0311 10:10:42.889476 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b362a55-c049-48f8-9c1c-ed0f66ef4bc4" containerName="oc" Mar 11 10:10:42 crc kubenswrapper[4830]: I0311 10:10:42.889494 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b362a55-c049-48f8-9c1c-ed0f66ef4bc4" containerName="oc" Mar 11 10:10:42 crc kubenswrapper[4830]: E0311 10:10:42.889526 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" containerName="tempest-tests-tempest-tests-runner" Mar 11 10:10:42 crc kubenswrapper[4830]: I0311 10:10:42.889535 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" containerName="tempest-tests-tempest-tests-runner" Mar 11 10:10:42 crc kubenswrapper[4830]: I0311 10:10:42.889769 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3" containerName="tempest-tests-tempest-tests-runner" Mar 11 10:10:42 crc kubenswrapper[4830]: I0311 10:10:42.889798 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b362a55-c049-48f8-9c1c-ed0f66ef4bc4" containerName="oc" Mar 11 10:10:42 crc kubenswrapper[4830]: I0311 10:10:42.890674 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:42 crc kubenswrapper[4830]: I0311 10:10:42.893140 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-c5jjz" Mar 11 10:10:42 crc kubenswrapper[4830]: I0311 10:10:42.897648 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.060214 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.060270 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.062273 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llngm\" (UniqueName: \"kubernetes.io/projected/c01f3e65-5b46-4373-a420-2d966d66a081-kube-api-access-llngm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c01f3e65-5b46-4373-a420-2d966d66a081\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.062430 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"c01f3e65-5b46-4373-a420-2d966d66a081\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.164489 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c01f3e65-5b46-4373-a420-2d966d66a081\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.164651 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llngm\" (UniqueName: \"kubernetes.io/projected/c01f3e65-5b46-4373-a420-2d966d66a081-kube-api-access-llngm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c01f3e65-5b46-4373-a420-2d966d66a081\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.166154 4830 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c01f3e65-5b46-4373-a420-2d966d66a081\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.188471 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llngm\" (UniqueName: \"kubernetes.io/projected/c01f3e65-5b46-4373-a420-2d966d66a081-kube-api-access-llngm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c01f3e65-5b46-4373-a420-2d966d66a081\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.205205 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c01f3e65-5b46-4373-a420-2d966d66a081\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.221504 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 11 10:10:43 crc kubenswrapper[4830]: I0311 10:10:43.656508 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 11 10:10:44 crc kubenswrapper[4830]: I0311 10:10:44.487409 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c01f3e65-5b46-4373-a420-2d966d66a081","Type":"ContainerStarted","Data":"f1826326d4912f2e839a6d006ee5d4573fbb021b83ce361de515bb7c58a20bf2"} Mar 11 10:10:46 crc kubenswrapper[4830]: I0311 10:10:46.531190 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c01f3e65-5b46-4373-a420-2d966d66a081","Type":"ContainerStarted","Data":"3f33d67e51989ae8805e8411edbd382194c9967936ed32dbe2b6db2ae8a20965"} Mar 11 10:10:46 crc kubenswrapper[4830]: I0311 10:10:46.554777 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.825963482 podStartE2EDuration="4.554753718s" podCreationTimestamp="2026-03-11 10:10:42 +0000 UTC" firstStartedPulling="2026-03-11 10:10:43.672280373 +0000 UTC m=+3411.453431062" lastFinishedPulling="2026-03-11 10:10:45.401070609 +0000 UTC m=+3413.182221298" observedRunningTime="2026-03-11 10:10:46.549636801 +0000 UTC m=+3414.330787510" watchObservedRunningTime="2026-03-11 10:10:46.554753718 +0000 UTC 
m=+3414.335904407" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.427923 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lr6wr"] Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.432407 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.443644 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr6wr"] Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.615730 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-catalog-content\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.615809 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpz9c\" (UniqueName: \"kubernetes.io/projected/e8f42e72-1990-4fd5-b156-d4427deb7e95-kube-api-access-cpz9c\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.615913 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-utilities\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.628474 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gq5gj"] Mar 11 10:11:03 crc 
kubenswrapper[4830]: I0311 10:11:03.631037 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.638913 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq5gj"] Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.717823 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-catalog-content\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.717900 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpz9c\" (UniqueName: \"kubernetes.io/projected/e8f42e72-1990-4fd5-b156-d4427deb7e95-kube-api-access-cpz9c\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.718086 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-utilities\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.718381 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-catalog-content\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.718552 4830 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-utilities\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.738112 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpz9c\" (UniqueName: \"kubernetes.io/projected/e8f42e72-1990-4fd5-b156-d4427deb7e95-kube-api-access-cpz9c\") pod \"redhat-marketplace-lr6wr\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.771118 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.821051 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-catalog-content\") pod \"redhat-operators-gq5gj\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.821676 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-utilities\") pod \"redhat-operators-gq5gj\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.821998 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz9lk\" (UniqueName: \"kubernetes.io/projected/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-kube-api-access-hz9lk\") pod 
\"redhat-operators-gq5gj\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.924382 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz9lk\" (UniqueName: \"kubernetes.io/projected/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-kube-api-access-hz9lk\") pod \"redhat-operators-gq5gj\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.924502 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-catalog-content\") pod \"redhat-operators-gq5gj\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.924523 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-utilities\") pod \"redhat-operators-gq5gj\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.925141 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-utilities\") pod \"redhat-operators-gq5gj\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.925179 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-catalog-content\") pod \"redhat-operators-gq5gj\" (UID: 
\"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.949579 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz9lk\" (UniqueName: \"kubernetes.io/projected/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-kube-api-access-hz9lk\") pod \"redhat-operators-gq5gj\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:03 crc kubenswrapper[4830]: I0311 10:11:03.968717 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:04 crc kubenswrapper[4830]: I0311 10:11:04.285560 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr6wr"] Mar 11 10:11:04 crc kubenswrapper[4830]: W0311 10:11:04.298863 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f42e72_1990_4fd5_b156_d4427deb7e95.slice/crio-6539c7144cc5f2bcbb3c1829bb28f7859a44a3591d35f43ba7f6464dfe18763c WatchSource:0}: Error finding container 6539c7144cc5f2bcbb3c1829bb28f7859a44a3591d35f43ba7f6464dfe18763c: Status 404 returned error can't find the container with id 6539c7144cc5f2bcbb3c1829bb28f7859a44a3591d35f43ba7f6464dfe18763c Mar 11 10:11:04 crc kubenswrapper[4830]: I0311 10:11:04.493389 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq5gj"] Mar 11 10:11:04 crc kubenswrapper[4830]: I0311 10:11:04.718208 4830 generic.go:334] "Generic (PLEG): container finished" podID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerID="47cd0c1bbc45df87ed660e61b86ff49422709e04aca310244a00abf0e4b96ae7" exitCode=0 Mar 11 10:11:04 crc kubenswrapper[4830]: I0311 10:11:04.718264 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5gj" 
event={"ID":"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2","Type":"ContainerDied","Data":"47cd0c1bbc45df87ed660e61b86ff49422709e04aca310244a00abf0e4b96ae7"} Mar 11 10:11:04 crc kubenswrapper[4830]: I0311 10:11:04.718340 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5gj" event={"ID":"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2","Type":"ContainerStarted","Data":"34cc35752c5b5c87af0d5af506f2142a3048d3a22118dd4cfe1915ac363571d2"} Mar 11 10:11:04 crc kubenswrapper[4830]: I0311 10:11:04.720170 4830 generic.go:334] "Generic (PLEG): container finished" podID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerID="bbf53188230f61fbfbe2892d5ad74e3ab956496474db08689e6f3e4aff7034ae" exitCode=0 Mar 11 10:11:04 crc kubenswrapper[4830]: I0311 10:11:04.720202 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr6wr" event={"ID":"e8f42e72-1990-4fd5-b156-d4427deb7e95","Type":"ContainerDied","Data":"bbf53188230f61fbfbe2892d5ad74e3ab956496474db08689e6f3e4aff7034ae"} Mar 11 10:11:04 crc kubenswrapper[4830]: I0311 10:11:04.720226 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr6wr" event={"ID":"e8f42e72-1990-4fd5-b156-d4427deb7e95","Type":"ContainerStarted","Data":"6539c7144cc5f2bcbb3c1829bb28f7859a44a3591d35f43ba7f6464dfe18763c"} Mar 11 10:11:06 crc kubenswrapper[4830]: I0311 10:11:06.742262 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5gj" event={"ID":"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2","Type":"ContainerStarted","Data":"48f2aa0f9059bebc618a1e98daa13ab01c37b50131e3b680e205ba672593047c"} Mar 11 10:11:06 crc kubenswrapper[4830]: I0311 10:11:06.745171 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr6wr" 
event={"ID":"e8f42e72-1990-4fd5-b156-d4427deb7e95","Type":"ContainerStarted","Data":"c0e389e63d9439fa406cbc54009e9ef77be522955c4cc414be53cee5133efd1e"} Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.175966 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znshr/must-gather-wm52g"] Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.178813 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znshr/must-gather-wm52g" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.181746 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-znshr"/"openshift-service-ca.crt" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.181990 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-znshr"/"kube-root-ca.crt" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.192382 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwgml\" (UniqueName: \"kubernetes.io/projected/98e5de6b-39e2-468d-a621-264332585f2f-kube-api-access-kwgml\") pod \"must-gather-wm52g\" (UID: \"98e5de6b-39e2-468d-a621-264332585f2f\") " pod="openshift-must-gather-znshr/must-gather-wm52g" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.192438 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98e5de6b-39e2-468d-a621-264332585f2f-must-gather-output\") pod \"must-gather-wm52g\" (UID: \"98e5de6b-39e2-468d-a621-264332585f2f\") " pod="openshift-must-gather-znshr/must-gather-wm52g" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.206415 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znshr/must-gather-wm52g"] Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.293948 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kwgml\" (UniqueName: \"kubernetes.io/projected/98e5de6b-39e2-468d-a621-264332585f2f-kube-api-access-kwgml\") pod \"must-gather-wm52g\" (UID: \"98e5de6b-39e2-468d-a621-264332585f2f\") " pod="openshift-must-gather-znshr/must-gather-wm52g" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.294003 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98e5de6b-39e2-468d-a621-264332585f2f-must-gather-output\") pod \"must-gather-wm52g\" (UID: \"98e5de6b-39e2-468d-a621-264332585f2f\") " pod="openshift-must-gather-znshr/must-gather-wm52g" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.294422 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98e5de6b-39e2-468d-a621-264332585f2f-must-gather-output\") pod \"must-gather-wm52g\" (UID: \"98e5de6b-39e2-468d-a621-264332585f2f\") " pod="openshift-must-gather-znshr/must-gather-wm52g" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.312885 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwgml\" (UniqueName: \"kubernetes.io/projected/98e5de6b-39e2-468d-a621-264332585f2f-kube-api-access-kwgml\") pod \"must-gather-wm52g\" (UID: \"98e5de6b-39e2-468d-a621-264332585f2f\") " pod="openshift-must-gather-znshr/must-gather-wm52g" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.504626 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znshr/must-gather-wm52g" Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.755860 4830 generic.go:334] "Generic (PLEG): container finished" podID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerID="c0e389e63d9439fa406cbc54009e9ef77be522955c4cc414be53cee5133efd1e" exitCode=0 Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.757824 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr6wr" event={"ID":"e8f42e72-1990-4fd5-b156-d4427deb7e95","Type":"ContainerDied","Data":"c0e389e63d9439fa406cbc54009e9ef77be522955c4cc414be53cee5133efd1e"} Mar 11 10:11:07 crc kubenswrapper[4830]: I0311 10:11:07.998274 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-znshr/must-gather-wm52g"] Mar 11 10:11:08 crc kubenswrapper[4830]: W0311 10:11:08.000642 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e5de6b_39e2_468d_a621_264332585f2f.slice/crio-121a54d6e426ec6d46e008494d060d5018ecf5c7f6504038eb6ae79ca3d8aa35 WatchSource:0}: Error finding container 121a54d6e426ec6d46e008494d060d5018ecf5c7f6504038eb6ae79ca3d8aa35: Status 404 returned error can't find the container with id 121a54d6e426ec6d46e008494d060d5018ecf5c7f6504038eb6ae79ca3d8aa35 Mar 11 10:11:08 crc kubenswrapper[4830]: I0311 10:11:08.765845 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/must-gather-wm52g" event={"ID":"98e5de6b-39e2-468d-a621-264332585f2f","Type":"ContainerStarted","Data":"121a54d6e426ec6d46e008494d060d5018ecf5c7f6504038eb6ae79ca3d8aa35"} Mar 11 10:11:09 crc kubenswrapper[4830]: I0311 10:11:09.788602 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr6wr" 
event={"ID":"e8f42e72-1990-4fd5-b156-d4427deb7e95","Type":"ContainerStarted","Data":"c95bdd8d7985d0685f4ef8771bdaf43108cb98831ca5e222aab3db8bbe36b166"} Mar 11 10:11:12 crc kubenswrapper[4830]: I0311 10:11:12.980411 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lr6wr" podStartSLOduration=5.801239997 podStartE2EDuration="9.980393282s" podCreationTimestamp="2026-03-11 10:11:03 +0000 UTC" firstStartedPulling="2026-03-11 10:11:04.722209469 +0000 UTC m=+3432.503360158" lastFinishedPulling="2026-03-11 10:11:08.901362764 +0000 UTC m=+3436.682513443" observedRunningTime="2026-03-11 10:11:09.81750496 +0000 UTC m=+3437.598655649" watchObservedRunningTime="2026-03-11 10:11:12.980393282 +0000 UTC m=+3440.761543971" Mar 11 10:11:13 crc kubenswrapper[4830]: I0311 10:11:13.060638 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:11:13 crc kubenswrapper[4830]: I0311 10:11:13.060718 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:11:13 crc kubenswrapper[4830]: I0311 10:11:13.772063 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:13 crc kubenswrapper[4830]: I0311 10:11:13.772409 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:13 crc kubenswrapper[4830]: I0311 10:11:13.826635 4830 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:14 crc kubenswrapper[4830]: I0311 10:11:14.877952 4830 generic.go:334] "Generic (PLEG): container finished" podID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerID="48f2aa0f9059bebc618a1e98daa13ab01c37b50131e3b680e205ba672593047c" exitCode=0 Mar 11 10:11:14 crc kubenswrapper[4830]: I0311 10:11:14.878053 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5gj" event={"ID":"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2","Type":"ContainerDied","Data":"48f2aa0f9059bebc618a1e98daa13ab01c37b50131e3b680e205ba672593047c"} Mar 11 10:11:15 crc kubenswrapper[4830]: I0311 10:11:15.891978 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/must-gather-wm52g" event={"ID":"98e5de6b-39e2-468d-a621-264332585f2f","Type":"ContainerStarted","Data":"795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b"} Mar 11 10:11:15 crc kubenswrapper[4830]: I0311 10:11:15.892483 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/must-gather-wm52g" event={"ID":"98e5de6b-39e2-468d-a621-264332585f2f","Type":"ContainerStarted","Data":"42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c"} Mar 11 10:11:15 crc kubenswrapper[4830]: I0311 10:11:15.915861 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-znshr/must-gather-wm52g" podStartSLOduration=1.534348869 podStartE2EDuration="8.915835671s" podCreationTimestamp="2026-03-11 10:11:07 +0000 UTC" firstStartedPulling="2026-03-11 10:11:08.003130775 +0000 UTC m=+3435.784281464" lastFinishedPulling="2026-03-11 10:11:15.384617577 +0000 UTC m=+3443.165768266" observedRunningTime="2026-03-11 10:11:15.910412414 +0000 UTC m=+3443.691563113" watchObservedRunningTime="2026-03-11 10:11:15.915835671 +0000 UTC m=+3443.696986380" Mar 11 10:11:16 crc kubenswrapper[4830]: 
I0311 10:11:16.906698 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5gj" event={"ID":"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2","Type":"ContainerStarted","Data":"2518246ca096fc674af64bd2d0fcae4bed583716b5aef898df4d00dd056c1610"} Mar 11 10:11:16 crc kubenswrapper[4830]: I0311 10:11:16.927530 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gq5gj" podStartSLOduration=2.8144797 podStartE2EDuration="13.927508019s" podCreationTimestamp="2026-03-11 10:11:03 +0000 UTC" firstStartedPulling="2026-03-11 10:11:04.722069295 +0000 UTC m=+3432.503219984" lastFinishedPulling="2026-03-11 10:11:15.835097614 +0000 UTC m=+3443.616248303" observedRunningTime="2026-03-11 10:11:16.922122232 +0000 UTC m=+3444.703272931" watchObservedRunningTime="2026-03-11 10:11:16.927508019 +0000 UTC m=+3444.708658708" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.654052 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znshr/crc-debug-nhnk9"] Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.655601 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.658854 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-znshr"/"default-dockercfg-8hwmp" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.705874 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d7b42ae-6e19-4984-81f6-44914a6065b1-host\") pod \"crc-debug-nhnk9\" (UID: \"1d7b42ae-6e19-4984-81f6-44914a6065b1\") " pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.706086 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7pk\" (UniqueName: \"kubernetes.io/projected/1d7b42ae-6e19-4984-81f6-44914a6065b1-kube-api-access-bk7pk\") pod \"crc-debug-nhnk9\" (UID: \"1d7b42ae-6e19-4984-81f6-44914a6065b1\") " pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.808407 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d7b42ae-6e19-4984-81f6-44914a6065b1-host\") pod \"crc-debug-nhnk9\" (UID: \"1d7b42ae-6e19-4984-81f6-44914a6065b1\") " pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.808568 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d7b42ae-6e19-4984-81f6-44914a6065b1-host\") pod \"crc-debug-nhnk9\" (UID: \"1d7b42ae-6e19-4984-81f6-44914a6065b1\") " pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.808599 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7pk\" (UniqueName: 
\"kubernetes.io/projected/1d7b42ae-6e19-4984-81f6-44914a6065b1-kube-api-access-bk7pk\") pod \"crc-debug-nhnk9\" (UID: \"1d7b42ae-6e19-4984-81f6-44914a6065b1\") " pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.831988 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7pk\" (UniqueName: \"kubernetes.io/projected/1d7b42ae-6e19-4984-81f6-44914a6065b1-kube-api-access-bk7pk\") pod \"crc-debug-nhnk9\" (UID: \"1d7b42ae-6e19-4984-81f6-44914a6065b1\") " pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:11:19 crc kubenswrapper[4830]: I0311 10:11:19.977765 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:11:20 crc kubenswrapper[4830]: W0311 10:11:20.023581 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d7b42ae_6e19_4984_81f6_44914a6065b1.slice/crio-8300391a5a0b6cded8c5f7eefb09d9cfcf111fcf803dff2b76624bafd6b4dd10 WatchSource:0}: Error finding container 8300391a5a0b6cded8c5f7eefb09d9cfcf111fcf803dff2b76624bafd6b4dd10: Status 404 returned error can't find the container with id 8300391a5a0b6cded8c5f7eefb09d9cfcf111fcf803dff2b76624bafd6b4dd10 Mar 11 10:11:20 crc kubenswrapper[4830]: I0311 10:11:20.954278 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/crc-debug-nhnk9" event={"ID":"1d7b42ae-6e19-4984-81f6-44914a6065b1","Type":"ContainerStarted","Data":"8300391a5a0b6cded8c5f7eefb09d9cfcf111fcf803dff2b76624bafd6b4dd10"} Mar 11 10:11:23 crc kubenswrapper[4830]: I0311 10:11:23.852367 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:23 crc kubenswrapper[4830]: I0311 10:11:23.969383 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:23 crc kubenswrapper[4830]: I0311 10:11:23.969474 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:25 crc kubenswrapper[4830]: I0311 10:11:25.045967 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gq5gj" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="registry-server" probeResult="failure" output=< Mar 11 10:11:25 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 10:11:25 crc kubenswrapper[4830]: > Mar 11 10:11:25 crc kubenswrapper[4830]: I0311 10:11:25.616793 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr6wr"] Mar 11 10:11:25 crc kubenswrapper[4830]: I0311 10:11:25.617074 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lr6wr" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerName="registry-server" containerID="cri-o://c95bdd8d7985d0685f4ef8771bdaf43108cb98831ca5e222aab3db8bbe36b166" gracePeriod=2 Mar 11 10:11:26 crc kubenswrapper[4830]: I0311 10:11:26.001849 4830 generic.go:334] "Generic (PLEG): container finished" podID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerID="c95bdd8d7985d0685f4ef8771bdaf43108cb98831ca5e222aab3db8bbe36b166" exitCode=0 Mar 11 10:11:26 crc kubenswrapper[4830]: I0311 10:11:26.001899 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr6wr" event={"ID":"e8f42e72-1990-4fd5-b156-d4427deb7e95","Type":"ContainerDied","Data":"c95bdd8d7985d0685f4ef8771bdaf43108cb98831ca5e222aab3db8bbe36b166"} Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.683431 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.792282 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-utilities\") pod \"e8f42e72-1990-4fd5-b156-d4427deb7e95\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.792977 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpz9c\" (UniqueName: \"kubernetes.io/projected/e8f42e72-1990-4fd5-b156-d4427deb7e95-kube-api-access-cpz9c\") pod \"e8f42e72-1990-4fd5-b156-d4427deb7e95\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.793239 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-utilities" (OuterVolumeSpecName: "utilities") pod "e8f42e72-1990-4fd5-b156-d4427deb7e95" (UID: "e8f42e72-1990-4fd5-b156-d4427deb7e95"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.793534 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-catalog-content\") pod \"e8f42e72-1990-4fd5-b156-d4427deb7e95\" (UID: \"e8f42e72-1990-4fd5-b156-d4427deb7e95\") " Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.794625 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.801453 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f42e72-1990-4fd5-b156-d4427deb7e95-kube-api-access-cpz9c" (OuterVolumeSpecName: "kube-api-access-cpz9c") pod "e8f42e72-1990-4fd5-b156-d4427deb7e95" (UID: "e8f42e72-1990-4fd5-b156-d4427deb7e95"). InnerVolumeSpecName "kube-api-access-cpz9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.826810 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8f42e72-1990-4fd5-b156-d4427deb7e95" (UID: "e8f42e72-1990-4fd5-b156-d4427deb7e95"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.896536 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f42e72-1990-4fd5-b156-d4427deb7e95-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:11:33 crc kubenswrapper[4830]: I0311 10:11:33.896576 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpz9c\" (UniqueName: \"kubernetes.io/projected/e8f42e72-1990-4fd5-b156-d4427deb7e95-kube-api-access-cpz9c\") on node \"crc\" DevicePath \"\"" Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.071008 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lr6wr" event={"ID":"e8f42e72-1990-4fd5-b156-d4427deb7e95","Type":"ContainerDied","Data":"6539c7144cc5f2bcbb3c1829bb28f7859a44a3591d35f43ba7f6464dfe18763c"} Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.071669 4830 scope.go:117] "RemoveContainer" containerID="c95bdd8d7985d0685f4ef8771bdaf43108cb98831ca5e222aab3db8bbe36b166" Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.071292 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lr6wr" Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.073092 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/crc-debug-nhnk9" event={"ID":"1d7b42ae-6e19-4984-81f6-44914a6065b1","Type":"ContainerStarted","Data":"a5192bb083b89497d33d0e226c7ef5764568370423780d76954d112a673849be"} Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.094775 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-znshr/crc-debug-nhnk9" podStartSLOduration=1.707085803 podStartE2EDuration="15.094757808s" podCreationTimestamp="2026-03-11 10:11:19 +0000 UTC" firstStartedPulling="2026-03-11 10:11:20.027093738 +0000 UTC m=+3447.808244427" lastFinishedPulling="2026-03-11 10:11:33.414765743 +0000 UTC m=+3461.195916432" observedRunningTime="2026-03-11 10:11:34.087580663 +0000 UTC m=+3461.868731352" watchObservedRunningTime="2026-03-11 10:11:34.094757808 +0000 UTC m=+3461.875908497" Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.128001 4830 scope.go:117] "RemoveContainer" containerID="c0e389e63d9439fa406cbc54009e9ef77be522955c4cc414be53cee5133efd1e" Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.161322 4830 scope.go:117] "RemoveContainer" containerID="bbf53188230f61fbfbe2892d5ad74e3ab956496474db08689e6f3e4aff7034ae" Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.184041 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr6wr"] Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.195922 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lr6wr"] Mar 11 10:11:34 crc kubenswrapper[4830]: I0311 10:11:34.945013 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" path="/var/lib/kubelet/pods/e8f42e72-1990-4fd5-b156-d4427deb7e95/volumes" Mar 11 10:11:35 
crc kubenswrapper[4830]: I0311 10:11:35.018496 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gq5gj" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="registry-server" probeResult="failure" output=< Mar 11 10:11:35 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 10:11:35 crc kubenswrapper[4830]: > Mar 11 10:11:43 crc kubenswrapper[4830]: I0311 10:11:43.061491 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:11:43 crc kubenswrapper[4830]: I0311 10:11:43.062442 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:11:43 crc kubenswrapper[4830]: I0311 10:11:43.062521 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 10:11:43 crc kubenswrapper[4830]: I0311 10:11:43.063769 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39b77e2518e07260183281640d393fb923092c241d46b9fda5255d21c4ba975d"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 10:11:43 crc kubenswrapper[4830]: I0311 10:11:43.063841 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" 
podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://39b77e2518e07260183281640d393fb923092c241d46b9fda5255d21c4ba975d" gracePeriod=600 Mar 11 10:11:44 crc kubenswrapper[4830]: I0311 10:11:44.176702 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="39b77e2518e07260183281640d393fb923092c241d46b9fda5255d21c4ba975d" exitCode=0 Mar 11 10:11:44 crc kubenswrapper[4830]: I0311 10:11:44.176776 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"39b77e2518e07260183281640d393fb923092c241d46b9fda5255d21c4ba975d"} Mar 11 10:11:44 crc kubenswrapper[4830]: I0311 10:11:44.177367 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"} Mar 11 10:11:44 crc kubenswrapper[4830]: I0311 10:11:44.177397 4830 scope.go:117] "RemoveContainer" containerID="2a8a4b7e9fd58e8c5802049e8099c7905c8d497c8b3d70ad4c07d95717c68d09" Mar 11 10:11:45 crc kubenswrapper[4830]: I0311 10:11:45.020259 4830 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gq5gj" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="registry-server" probeResult="failure" output=< Mar 11 10:11:45 crc kubenswrapper[4830]: timeout: failed to connect service ":50051" within 1s Mar 11 10:11:45 crc kubenswrapper[4830]: > Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.188044 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n2njz"] Mar 11 10:11:53 crc kubenswrapper[4830]: E0311 10:11:53.189057 4830 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerName="registry-server" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.189076 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerName="registry-server" Mar 11 10:11:53 crc kubenswrapper[4830]: E0311 10:11:53.189111 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerName="extract-content" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.189120 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerName="extract-content" Mar 11 10:11:53 crc kubenswrapper[4830]: E0311 10:11:53.189152 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerName="extract-utilities" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.189163 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerName="extract-utilities" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.189400 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f42e72-1990-4fd5-b156-d4427deb7e95" containerName="registry-server" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.191091 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.205344 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n2njz"] Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.356856 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-utilities\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.357051 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-catalog-content\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.357181 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpj74\" (UniqueName: \"kubernetes.io/projected/2328aff6-f748-4754-90eb-66366d3a7c6b-kube-api-access-tpj74\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.459980 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-catalog-content\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.460140 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tpj74\" (UniqueName: \"kubernetes.io/projected/2328aff6-f748-4754-90eb-66366d3a7c6b-kube-api-access-tpj74\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.460196 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-utilities\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.460727 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-utilities\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.460740 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-catalog-content\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.481956 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpj74\" (UniqueName: \"kubernetes.io/projected/2328aff6-f748-4754-90eb-66366d3a7c6b-kube-api-access-tpj74\") pod \"certified-operators-n2njz\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:53 crc kubenswrapper[4830]: I0311 10:11:53.514129 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:11:54 crc kubenswrapper[4830]: I0311 10:11:54.033467 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n2njz"] Mar 11 10:11:54 crc kubenswrapper[4830]: I0311 10:11:54.054408 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:54 crc kubenswrapper[4830]: I0311 10:11:54.136288 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:54 crc kubenswrapper[4830]: I0311 10:11:54.322611 4830 generic.go:334] "Generic (PLEG): container finished" podID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerID="921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275" exitCode=0 Mar 11 10:11:54 crc kubenswrapper[4830]: I0311 10:11:54.322654 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2njz" event={"ID":"2328aff6-f748-4754-90eb-66366d3a7c6b","Type":"ContainerDied","Data":"921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275"} Mar 11 10:11:54 crc kubenswrapper[4830]: I0311 10:11:54.322691 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2njz" event={"ID":"2328aff6-f748-4754-90eb-66366d3a7c6b","Type":"ContainerStarted","Data":"77521385e86f0740537b89f0822b769180dcb23c13069c05cce4ae8604f94308"} Mar 11 10:11:56 crc kubenswrapper[4830]: I0311 10:11:56.339281 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2njz" event={"ID":"2328aff6-f748-4754-90eb-66366d3a7c6b","Type":"ContainerStarted","Data":"16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3"} Mar 11 10:11:56 crc kubenswrapper[4830]: I0311 10:11:56.370194 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-gq5gj"] Mar 11 10:11:56 crc kubenswrapper[4830]: I0311 10:11:56.370484 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gq5gj" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="registry-server" containerID="cri-o://2518246ca096fc674af64bd2d0fcae4bed583716b5aef898df4d00dd056c1610" gracePeriod=2 Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.351739 4830 generic.go:334] "Generic (PLEG): container finished" podID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerID="16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3" exitCode=0 Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.351914 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2njz" event={"ID":"2328aff6-f748-4754-90eb-66366d3a7c6b","Type":"ContainerDied","Data":"16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3"} Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.355322 4830 generic.go:334] "Generic (PLEG): container finished" podID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerID="2518246ca096fc674af64bd2d0fcae4bed583716b5aef898df4d00dd056c1610" exitCode=0 Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.355354 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5gj" event={"ID":"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2","Type":"ContainerDied","Data":"2518246ca096fc674af64bd2d0fcae4bed583716b5aef898df4d00dd056c1610"} Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.355376 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq5gj" event={"ID":"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2","Type":"ContainerDied","Data":"34cc35752c5b5c87af0d5af506f2142a3048d3a22118dd4cfe1915ac363571d2"} Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.355387 4830 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="34cc35752c5b5c87af0d5af506f2142a3048d3a22118dd4cfe1915ac363571d2" Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.381747 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.539624 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz9lk\" (UniqueName: \"kubernetes.io/projected/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-kube-api-access-hz9lk\") pod \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.539723 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-catalog-content\") pod \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.539997 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-utilities\") pod \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\" (UID: \"04a153bc-a0eb-4e46-aba5-b49c0bc27fe2\") " Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.540727 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-utilities" (OuterVolumeSpecName: "utilities") pod "04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" (UID: "04a153bc-a0eb-4e46-aba5-b49c0bc27fe2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.549104 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-kube-api-access-hz9lk" (OuterVolumeSpecName: "kube-api-access-hz9lk") pod "04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" (UID: "04a153bc-a0eb-4e46-aba5-b49c0bc27fe2"). InnerVolumeSpecName "kube-api-access-hz9lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.642934 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz9lk\" (UniqueName: \"kubernetes.io/projected/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-kube-api-access-hz9lk\") on node \"crc\" DevicePath \"\"" Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.642986 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.699942 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" (UID: "04a153bc-a0eb-4e46-aba5-b49c0bc27fe2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:11:57 crc kubenswrapper[4830]: I0311 10:11:57.744223 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:11:58 crc kubenswrapper[4830]: I0311 10:11:58.366122 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2njz" event={"ID":"2328aff6-f748-4754-90eb-66366d3a7c6b","Type":"ContainerStarted","Data":"e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de"} Mar 11 10:11:58 crc kubenswrapper[4830]: I0311 10:11:58.366146 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq5gj" Mar 11 10:11:58 crc kubenswrapper[4830]: I0311 10:11:58.385279 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n2njz" podStartSLOduration=1.716586123 podStartE2EDuration="5.385265478s" podCreationTimestamp="2026-03-11 10:11:53 +0000 UTC" firstStartedPulling="2026-03-11 10:11:54.323845969 +0000 UTC m=+3482.104996658" lastFinishedPulling="2026-03-11 10:11:57.992525324 +0000 UTC m=+3485.773676013" observedRunningTime="2026-03-11 10:11:58.385139124 +0000 UTC m=+3486.166289833" watchObservedRunningTime="2026-03-11 10:11:58.385265478 +0000 UTC m=+3486.166416167" Mar 11 10:11:58 crc kubenswrapper[4830]: I0311 10:11:58.406687 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq5gj"] Mar 11 10:11:58 crc kubenswrapper[4830]: I0311 10:11:58.415233 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gq5gj"] Mar 11 10:11:58 crc kubenswrapper[4830]: I0311 10:11:58.943794 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" 
path="/var/lib/kubelet/pods/04a153bc-a0eb-4e46-aba5-b49c0bc27fe2/volumes" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.147394 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553732-fdthc"] Mar 11 10:12:00 crc kubenswrapper[4830]: E0311 10:12:00.148137 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="extract-content" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.148149 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="extract-content" Mar 11 10:12:00 crc kubenswrapper[4830]: E0311 10:12:00.148193 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="registry-server" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.148200 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="registry-server" Mar 11 10:12:00 crc kubenswrapper[4830]: E0311 10:12:00.148213 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="extract-utilities" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.148220 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="extract-utilities" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.148392 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a153bc-a0eb-4e46-aba5-b49c0bc27fe2" containerName="registry-server" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.149057 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553732-fdthc" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.152189 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.152363 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.157092 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.161734 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553732-fdthc"] Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.297194 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55kl7\" (UniqueName: \"kubernetes.io/projected/70ed5e7e-173b-4140-ba24-3cdf1796f2ae-kube-api-access-55kl7\") pod \"auto-csr-approver-29553732-fdthc\" (UID: \"70ed5e7e-173b-4140-ba24-3cdf1796f2ae\") " pod="openshift-infra/auto-csr-approver-29553732-fdthc" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.399129 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55kl7\" (UniqueName: \"kubernetes.io/projected/70ed5e7e-173b-4140-ba24-3cdf1796f2ae-kube-api-access-55kl7\") pod \"auto-csr-approver-29553732-fdthc\" (UID: \"70ed5e7e-173b-4140-ba24-3cdf1796f2ae\") " pod="openshift-infra/auto-csr-approver-29553732-fdthc" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.423483 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55kl7\" (UniqueName: \"kubernetes.io/projected/70ed5e7e-173b-4140-ba24-3cdf1796f2ae-kube-api-access-55kl7\") pod \"auto-csr-approver-29553732-fdthc\" (UID: \"70ed5e7e-173b-4140-ba24-3cdf1796f2ae\") " 
pod="openshift-infra/auto-csr-approver-29553732-fdthc" Mar 11 10:12:00 crc kubenswrapper[4830]: I0311 10:12:00.476584 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553732-fdthc" Mar 11 10:12:01 crc kubenswrapper[4830]: I0311 10:12:01.047283 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553732-fdthc"] Mar 11 10:12:01 crc kubenswrapper[4830]: I0311 10:12:01.395224 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553732-fdthc" event={"ID":"70ed5e7e-173b-4140-ba24-3cdf1796f2ae","Type":"ContainerStarted","Data":"699790aa32dd6dc15577a9aff9dd11ca55a45408a5eaa6de079bedb6e0c562f5"} Mar 11 10:12:02 crc kubenswrapper[4830]: I0311 10:12:02.411743 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553732-fdthc" event={"ID":"70ed5e7e-173b-4140-ba24-3cdf1796f2ae","Type":"ContainerStarted","Data":"685365cb764aebcbf9e30a927f1f7276048d557f0591d1293091263a6aa32896"} Mar 11 10:12:02 crc kubenswrapper[4830]: I0311 10:12:02.427460 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29553732-fdthc" podStartSLOduration=1.49957138 podStartE2EDuration="2.427443484s" podCreationTimestamp="2026-03-11 10:12:00 +0000 UTC" firstStartedPulling="2026-03-11 10:12:01.05268151 +0000 UTC m=+3488.833832199" lastFinishedPulling="2026-03-11 10:12:01.980553614 +0000 UTC m=+3489.761704303" observedRunningTime="2026-03-11 10:12:02.424832742 +0000 UTC m=+3490.205983441" watchObservedRunningTime="2026-03-11 10:12:02.427443484 +0000 UTC m=+3490.208594173" Mar 11 10:12:03 crc kubenswrapper[4830]: I0311 10:12:03.422698 4830 generic.go:334] "Generic (PLEG): container finished" podID="70ed5e7e-173b-4140-ba24-3cdf1796f2ae" containerID="685365cb764aebcbf9e30a927f1f7276048d557f0591d1293091263a6aa32896" exitCode=0 Mar 11 10:12:03 crc 
kubenswrapper[4830]: I0311 10:12:03.422747 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553732-fdthc" event={"ID":"70ed5e7e-173b-4140-ba24-3cdf1796f2ae","Type":"ContainerDied","Data":"685365cb764aebcbf9e30a927f1f7276048d557f0591d1293091263a6aa32896"} Mar 11 10:12:03 crc kubenswrapper[4830]: I0311 10:12:03.514254 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:12:03 crc kubenswrapper[4830]: I0311 10:12:03.514598 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:12:03 crc kubenswrapper[4830]: I0311 10:12:03.563995 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:12:04 crc kubenswrapper[4830]: I0311 10:12:04.488380 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:12:04 crc kubenswrapper[4830]: I0311 10:12:04.803318 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n2njz"] Mar 11 10:12:04 crc kubenswrapper[4830]: I0311 10:12:04.835113 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553732-fdthc" Mar 11 10:12:04 crc kubenswrapper[4830]: I0311 10:12:04.886841 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55kl7\" (UniqueName: \"kubernetes.io/projected/70ed5e7e-173b-4140-ba24-3cdf1796f2ae-kube-api-access-55kl7\") pod \"70ed5e7e-173b-4140-ba24-3cdf1796f2ae\" (UID: \"70ed5e7e-173b-4140-ba24-3cdf1796f2ae\") " Mar 11 10:12:04 crc kubenswrapper[4830]: I0311 10:12:04.893123 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ed5e7e-173b-4140-ba24-3cdf1796f2ae-kube-api-access-55kl7" (OuterVolumeSpecName: "kube-api-access-55kl7") pod "70ed5e7e-173b-4140-ba24-3cdf1796f2ae" (UID: "70ed5e7e-173b-4140-ba24-3cdf1796f2ae"). InnerVolumeSpecName "kube-api-access-55kl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:12:04 crc kubenswrapper[4830]: I0311 10:12:04.989351 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55kl7\" (UniqueName: \"kubernetes.io/projected/70ed5e7e-173b-4140-ba24-3cdf1796f2ae-kube-api-access-55kl7\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:05 crc kubenswrapper[4830]: I0311 10:12:05.440918 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553732-fdthc" event={"ID":"70ed5e7e-173b-4140-ba24-3cdf1796f2ae","Type":"ContainerDied","Data":"699790aa32dd6dc15577a9aff9dd11ca55a45408a5eaa6de079bedb6e0c562f5"} Mar 11 10:12:05 crc kubenswrapper[4830]: I0311 10:12:05.441284 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="699790aa32dd6dc15577a9aff9dd11ca55a45408a5eaa6de079bedb6e0c562f5" Mar 11 10:12:05 crc kubenswrapper[4830]: I0311 10:12:05.440933 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553732-fdthc" Mar 11 10:12:05 crc kubenswrapper[4830]: I0311 10:12:05.499787 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553726-d6mz7"] Mar 11 10:12:05 crc kubenswrapper[4830]: I0311 10:12:05.506706 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553726-d6mz7"] Mar 11 10:12:06 crc kubenswrapper[4830]: I0311 10:12:06.448700 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n2njz" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerName="registry-server" containerID="cri-o://e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de" gracePeriod=2 Mar 11 10:12:06 crc kubenswrapper[4830]: I0311 10:12:06.945759 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ce964b2-b5cc-4ba2-b9a4-c2c600feb743" path="/var/lib/kubelet/pods/8ce964b2-b5cc-4ba2-b9a4-c2c600feb743/volumes" Mar 11 10:12:06 crc kubenswrapper[4830]: I0311 10:12:06.952300 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.032699 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-utilities\") pod \"2328aff6-f748-4754-90eb-66366d3a7c6b\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.033095 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpj74\" (UniqueName: \"kubernetes.io/projected/2328aff6-f748-4754-90eb-66366d3a7c6b-kube-api-access-tpj74\") pod \"2328aff6-f748-4754-90eb-66366d3a7c6b\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.033245 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-catalog-content\") pod \"2328aff6-f748-4754-90eb-66366d3a7c6b\" (UID: \"2328aff6-f748-4754-90eb-66366d3a7c6b\") " Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.033876 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-utilities" (OuterVolumeSpecName: "utilities") pod "2328aff6-f748-4754-90eb-66366d3a7c6b" (UID: "2328aff6-f748-4754-90eb-66366d3a7c6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.053761 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2328aff6-f748-4754-90eb-66366d3a7c6b-kube-api-access-tpj74" (OuterVolumeSpecName: "kube-api-access-tpj74") pod "2328aff6-f748-4754-90eb-66366d3a7c6b" (UID: "2328aff6-f748-4754-90eb-66366d3a7c6b"). InnerVolumeSpecName "kube-api-access-tpj74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.112272 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2328aff6-f748-4754-90eb-66366d3a7c6b" (UID: "2328aff6-f748-4754-90eb-66366d3a7c6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.135743 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.135775 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpj74\" (UniqueName: \"kubernetes.io/projected/2328aff6-f748-4754-90eb-66366d3a7c6b-kube-api-access-tpj74\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.135787 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2328aff6-f748-4754-90eb-66366d3a7c6b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.460683 4830 generic.go:334] "Generic (PLEG): container finished" podID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerID="e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de" exitCode=0 Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.460724 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2njz" event={"ID":"2328aff6-f748-4754-90eb-66366d3a7c6b","Type":"ContainerDied","Data":"e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de"} Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.460739 4830 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n2njz" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.460757 4830 scope.go:117] "RemoveContainer" containerID="e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.460747 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n2njz" event={"ID":"2328aff6-f748-4754-90eb-66366d3a7c6b","Type":"ContainerDied","Data":"77521385e86f0740537b89f0822b769180dcb23c13069c05cce4ae8604f94308"} Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.480919 4830 scope.go:117] "RemoveContainer" containerID="16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.521077 4830 scope.go:117] "RemoveContainer" containerID="921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.521597 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n2njz"] Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.532520 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n2njz"] Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.577639 4830 scope.go:117] "RemoveContainer" containerID="e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de" Mar 11 10:12:07 crc kubenswrapper[4830]: E0311 10:12:07.578059 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de\": container with ID starting with e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de not found: ID does not exist" containerID="e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.578091 
4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de"} err="failed to get container status \"e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de\": rpc error: code = NotFound desc = could not find container \"e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de\": container with ID starting with e64ceb903dad9fa6a241f02f28e6addac38553683ca8ea583662cb05fac1f0de not found: ID does not exist" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.578114 4830 scope.go:117] "RemoveContainer" containerID="16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3" Mar 11 10:12:07 crc kubenswrapper[4830]: E0311 10:12:07.578366 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3\": container with ID starting with 16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3 not found: ID does not exist" containerID="16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.578392 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3"} err="failed to get container status \"16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3\": rpc error: code = NotFound desc = could not find container \"16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3\": container with ID starting with 16bf8ba58717943ef60f2ce71eb6a178bff82ab38eaf979649fea46b68e63bf3 not found: ID does not exist" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.578403 4830 scope.go:117] "RemoveContainer" containerID="921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275" Mar 11 10:12:07 crc kubenswrapper[4830]: E0311 
10:12:07.578637 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275\": container with ID starting with 921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275 not found: ID does not exist" containerID="921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275" Mar 11 10:12:07 crc kubenswrapper[4830]: I0311 10:12:07.578655 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275"} err="failed to get container status \"921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275\": rpc error: code = NotFound desc = could not find container \"921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275\": container with ID starting with 921962a03760e23e41f484321f51b77e1a57db786a3d7e97d2e40e1f41064275 not found: ID does not exist" Mar 11 10:12:08 crc kubenswrapper[4830]: I0311 10:12:08.943367 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" path="/var/lib/kubelet/pods/2328aff6-f748-4754-90eb-66366d3a7c6b/volumes" Mar 11 10:12:13 crc kubenswrapper[4830]: I0311 10:12:13.515656 4830 generic.go:334] "Generic (PLEG): container finished" podID="1d7b42ae-6e19-4984-81f6-44914a6065b1" containerID="a5192bb083b89497d33d0e226c7ef5764568370423780d76954d112a673849be" exitCode=0 Mar 11 10:12:13 crc kubenswrapper[4830]: I0311 10:12:13.515773 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/crc-debug-nhnk9" event={"ID":"1d7b42ae-6e19-4984-81f6-44914a6065b1","Type":"ContainerDied","Data":"a5192bb083b89497d33d0e226c7ef5764568370423780d76954d112a673849be"} Mar 11 10:12:13 crc kubenswrapper[4830]: I0311 10:12:13.573101 4830 scope.go:117] "RemoveContainer" 
containerID="ab2a7aa6993aa7bc747d02ca521d17a55fee5bc963aa7891814bcd687f34c96a" Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.621407 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.677269 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znshr/crc-debug-nhnk9"] Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.682205 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk7pk\" (UniqueName: \"kubernetes.io/projected/1d7b42ae-6e19-4984-81f6-44914a6065b1-kube-api-access-bk7pk\") pod \"1d7b42ae-6e19-4984-81f6-44914a6065b1\" (UID: \"1d7b42ae-6e19-4984-81f6-44914a6065b1\") " Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.682309 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d7b42ae-6e19-4984-81f6-44914a6065b1-host\") pod \"1d7b42ae-6e19-4984-81f6-44914a6065b1\" (UID: \"1d7b42ae-6e19-4984-81f6-44914a6065b1\") " Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.683097 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d7b42ae-6e19-4984-81f6-44914a6065b1-host" (OuterVolumeSpecName: "host") pod "1d7b42ae-6e19-4984-81f6-44914a6065b1" (UID: "1d7b42ae-6e19-4984-81f6-44914a6065b1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.685101 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znshr/crc-debug-nhnk9"] Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.690253 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d7b42ae-6e19-4984-81f6-44914a6065b1-kube-api-access-bk7pk" (OuterVolumeSpecName: "kube-api-access-bk7pk") pod "1d7b42ae-6e19-4984-81f6-44914a6065b1" (UID: "1d7b42ae-6e19-4984-81f6-44914a6065b1"). InnerVolumeSpecName "kube-api-access-bk7pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.784429 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk7pk\" (UniqueName: \"kubernetes.io/projected/1d7b42ae-6e19-4984-81f6-44914a6065b1-kube-api-access-bk7pk\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.784473 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d7b42ae-6e19-4984-81f6-44914a6065b1-host\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:14 crc kubenswrapper[4830]: I0311 10:12:14.943774 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d7b42ae-6e19-4984-81f6-44914a6065b1" path="/var/lib/kubelet/pods/1d7b42ae-6e19-4984-81f6-44914a6065b1/volumes" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.535351 4830 scope.go:117] "RemoveContainer" containerID="a5192bb083b89497d33d0e226c7ef5764568370423780d76954d112a673849be" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.535458 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-nhnk9" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.849942 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znshr/crc-debug-xjdx2"] Mar 11 10:12:15 crc kubenswrapper[4830]: E0311 10:12:15.850363 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerName="registry-server" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.850377 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerName="registry-server" Mar 11 10:12:15 crc kubenswrapper[4830]: E0311 10:12:15.850396 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ed5e7e-173b-4140-ba24-3cdf1796f2ae" containerName="oc" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.850402 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ed5e7e-173b-4140-ba24-3cdf1796f2ae" containerName="oc" Mar 11 10:12:15 crc kubenswrapper[4830]: E0311 10:12:15.850414 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d7b42ae-6e19-4984-81f6-44914a6065b1" containerName="container-00" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.850420 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d7b42ae-6e19-4984-81f6-44914a6065b1" containerName="container-00" Mar 11 10:12:15 crc kubenswrapper[4830]: E0311 10:12:15.850444 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerName="extract-utilities" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.850450 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerName="extract-utilities" Mar 11 10:12:15 crc kubenswrapper[4830]: E0311 10:12:15.850458 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" 
containerName="extract-content" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.850465 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerName="extract-content" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.850666 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ed5e7e-173b-4140-ba24-3cdf1796f2ae" containerName="oc" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.850682 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2328aff6-f748-4754-90eb-66366d3a7c6b" containerName="registry-server" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.850694 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d7b42ae-6e19-4984-81f6-44914a6065b1" containerName="container-00" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.851362 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.853231 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-znshr"/"default-dockercfg-8hwmp" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.902812 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-host\") pod \"crc-debug-xjdx2\" (UID: \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\") " pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:15 crc kubenswrapper[4830]: I0311 10:12:15.903233 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkf46\" (UniqueName: \"kubernetes.io/projected/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-kube-api-access-dkf46\") pod \"crc-debug-xjdx2\" (UID: \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\") " pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 
10:12:16 crc kubenswrapper[4830]: I0311 10:12:16.005057 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkf46\" (UniqueName: \"kubernetes.io/projected/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-kube-api-access-dkf46\") pod \"crc-debug-xjdx2\" (UID: \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\") " pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:16 crc kubenswrapper[4830]: I0311 10:12:16.005217 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-host\") pod \"crc-debug-xjdx2\" (UID: \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\") " pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:16 crc kubenswrapper[4830]: I0311 10:12:16.005406 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-host\") pod \"crc-debug-xjdx2\" (UID: \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\") " pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:16 crc kubenswrapper[4830]: I0311 10:12:16.021765 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkf46\" (UniqueName: \"kubernetes.io/projected/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-kube-api-access-dkf46\") pod \"crc-debug-xjdx2\" (UID: \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\") " pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:16 crc kubenswrapper[4830]: I0311 10:12:16.168415 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:16 crc kubenswrapper[4830]: I0311 10:12:16.545574 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/crc-debug-xjdx2" event={"ID":"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72","Type":"ContainerStarted","Data":"13af7b8a2271f40de79b382d91e85a30ed0a0cdc3bb95c8a316ec26588d40008"} Mar 11 10:12:16 crc kubenswrapper[4830]: I0311 10:12:16.545952 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/crc-debug-xjdx2" event={"ID":"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72","Type":"ContainerStarted","Data":"0557aabb68d7785cbd1243a924aa94055f51f8e6c6b1e3c3c7708f2cfd466a90"} Mar 11 10:12:16 crc kubenswrapper[4830]: I0311 10:12:16.564859 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-znshr/crc-debug-xjdx2" podStartSLOduration=1.564840803 podStartE2EDuration="1.564840803s" podCreationTimestamp="2026-03-11 10:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 10:12:16.559703483 +0000 UTC m=+3504.340854182" watchObservedRunningTime="2026-03-11 10:12:16.564840803 +0000 UTC m=+3504.345991492" Mar 11 10:12:17 crc kubenswrapper[4830]: I0311 10:12:17.557865 4830 generic.go:334] "Generic (PLEG): container finished" podID="2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72" containerID="13af7b8a2271f40de79b382d91e85a30ed0a0cdc3bb95c8a316ec26588d40008" exitCode=0 Mar 11 10:12:17 crc kubenswrapper[4830]: I0311 10:12:17.558012 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/crc-debug-xjdx2" event={"ID":"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72","Type":"ContainerDied","Data":"13af7b8a2271f40de79b382d91e85a30ed0a0cdc3bb95c8a316ec26588d40008"} Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.661655 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.713550 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znshr/crc-debug-xjdx2"] Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.721715 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znshr/crc-debug-xjdx2"] Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.754151 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-host\") pod \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\" (UID: \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\") " Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.754253 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkf46\" (UniqueName: \"kubernetes.io/projected/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-kube-api-access-dkf46\") pod \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\" (UID: \"2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72\") " Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.754296 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-host" (OuterVolumeSpecName: "host") pod "2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72" (UID: "2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.754735 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-host\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.760450 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-kube-api-access-dkf46" (OuterVolumeSpecName: "kube-api-access-dkf46") pod "2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72" (UID: "2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72"). InnerVolumeSpecName "kube-api-access-dkf46". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.856921 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkf46\" (UniqueName: \"kubernetes.io/projected/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72-kube-api-access-dkf46\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:18 crc kubenswrapper[4830]: I0311 10:12:18.942926 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72" path="/var/lib/kubelet/pods/2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72/volumes" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.577887 4830 scope.go:117] "RemoveContainer" containerID="13af7b8a2271f40de79b382d91e85a30ed0a0cdc3bb95c8a316ec26588d40008" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.577933 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-xjdx2" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.844728 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-znshr/crc-debug-xhbjn"] Mar 11 10:12:19 crc kubenswrapper[4830]: E0311 10:12:19.845375 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72" containerName="container-00" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.845394 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72" containerName="container-00" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.845636 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8f6ae4-9f90-4bc2-9d48-0e3d1089ae72" containerName="container-00" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.846401 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.848566 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-znshr"/"default-dockercfg-8hwmp" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.874679 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6c219392-6984-409d-b05c-76faaf866f4b-host\") pod \"crc-debug-xhbjn\" (UID: \"6c219392-6984-409d-b05c-76faaf866f4b\") " pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.874733 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w4z4\" (UniqueName: \"kubernetes.io/projected/6c219392-6984-409d-b05c-76faaf866f4b-kube-api-access-8w4z4\") pod \"crc-debug-xhbjn\" (UID: \"6c219392-6984-409d-b05c-76faaf866f4b\") " 
pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.976050 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6c219392-6984-409d-b05c-76faaf866f4b-host\") pod \"crc-debug-xhbjn\" (UID: \"6c219392-6984-409d-b05c-76faaf866f4b\") " pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.976132 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w4z4\" (UniqueName: \"kubernetes.io/projected/6c219392-6984-409d-b05c-76faaf866f4b-kube-api-access-8w4z4\") pod \"crc-debug-xhbjn\" (UID: \"6c219392-6984-409d-b05c-76faaf866f4b\") " pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.976487 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6c219392-6984-409d-b05c-76faaf866f4b-host\") pod \"crc-debug-xhbjn\" (UID: \"6c219392-6984-409d-b05c-76faaf866f4b\") " pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:19 crc kubenswrapper[4830]: I0311 10:12:19.994016 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w4z4\" (UniqueName: \"kubernetes.io/projected/6c219392-6984-409d-b05c-76faaf866f4b-kube-api-access-8w4z4\") pod \"crc-debug-xhbjn\" (UID: \"6c219392-6984-409d-b05c-76faaf866f4b\") " pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:20 crc kubenswrapper[4830]: I0311 10:12:20.162634 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:20 crc kubenswrapper[4830]: W0311 10:12:20.186102 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c219392_6984_409d_b05c_76faaf866f4b.slice/crio-007dfe74b41258eb29bbac662cb6b9c02e1a4f77ddba15fb0ece358c6e66d509 WatchSource:0}: Error finding container 007dfe74b41258eb29bbac662cb6b9c02e1a4f77ddba15fb0ece358c6e66d509: Status 404 returned error can't find the container with id 007dfe74b41258eb29bbac662cb6b9c02e1a4f77ddba15fb0ece358c6e66d509 Mar 11 10:12:20 crc kubenswrapper[4830]: I0311 10:12:20.588570 4830 generic.go:334] "Generic (PLEG): container finished" podID="6c219392-6984-409d-b05c-76faaf866f4b" containerID="5e105db437bb90d14eb468ebed213015ff0a1d5e51061000b857b338a27977d9" exitCode=0 Mar 11 10:12:20 crc kubenswrapper[4830]: I0311 10:12:20.588687 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/crc-debug-xhbjn" event={"ID":"6c219392-6984-409d-b05c-76faaf866f4b","Type":"ContainerDied","Data":"5e105db437bb90d14eb468ebed213015ff0a1d5e51061000b857b338a27977d9"} Mar 11 10:12:20 crc kubenswrapper[4830]: I0311 10:12:20.588895 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/crc-debug-xhbjn" event={"ID":"6c219392-6984-409d-b05c-76faaf866f4b","Type":"ContainerStarted","Data":"007dfe74b41258eb29bbac662cb6b9c02e1a4f77ddba15fb0ece358c6e66d509"} Mar 11 10:12:20 crc kubenswrapper[4830]: I0311 10:12:20.622973 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znshr/crc-debug-xhbjn"] Mar 11 10:12:20 crc kubenswrapper[4830]: I0311 10:12:20.632808 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znshr/crc-debug-xhbjn"] Mar 11 10:12:21 crc kubenswrapper[4830]: I0311 10:12:21.714102 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:21 crc kubenswrapper[4830]: I0311 10:12:21.808980 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6c219392-6984-409d-b05c-76faaf866f4b-host\") pod \"6c219392-6984-409d-b05c-76faaf866f4b\" (UID: \"6c219392-6984-409d-b05c-76faaf866f4b\") " Mar 11 10:12:21 crc kubenswrapper[4830]: I0311 10:12:21.809100 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c219392-6984-409d-b05c-76faaf866f4b-host" (OuterVolumeSpecName: "host") pod "6c219392-6984-409d-b05c-76faaf866f4b" (UID: "6c219392-6984-409d-b05c-76faaf866f4b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 10:12:21 crc kubenswrapper[4830]: I0311 10:12:21.809148 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w4z4\" (UniqueName: \"kubernetes.io/projected/6c219392-6984-409d-b05c-76faaf866f4b-kube-api-access-8w4z4\") pod \"6c219392-6984-409d-b05c-76faaf866f4b\" (UID: \"6c219392-6984-409d-b05c-76faaf866f4b\") " Mar 11 10:12:21 crc kubenswrapper[4830]: I0311 10:12:21.809656 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6c219392-6984-409d-b05c-76faaf866f4b-host\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:21 crc kubenswrapper[4830]: I0311 10:12:21.815210 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c219392-6984-409d-b05c-76faaf866f4b-kube-api-access-8w4z4" (OuterVolumeSpecName: "kube-api-access-8w4z4") pod "6c219392-6984-409d-b05c-76faaf866f4b" (UID: "6c219392-6984-409d-b05c-76faaf866f4b"). InnerVolumeSpecName "kube-api-access-8w4z4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:12:21 crc kubenswrapper[4830]: I0311 10:12:21.911154 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w4z4\" (UniqueName: \"kubernetes.io/projected/6c219392-6984-409d-b05c-76faaf866f4b-kube-api-access-8w4z4\") on node \"crc\" DevicePath \"\"" Mar 11 10:12:22 crc kubenswrapper[4830]: I0311 10:12:22.608218 4830 scope.go:117] "RemoveContainer" containerID="5e105db437bb90d14eb468ebed213015ff0a1d5e51061000b857b338a27977d9" Mar 11 10:12:22 crc kubenswrapper[4830]: I0311 10:12:22.608261 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znshr/crc-debug-xhbjn" Mar 11 10:12:22 crc kubenswrapper[4830]: I0311 10:12:22.944115 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c219392-6984-409d-b05c-76faaf866f4b" path="/var/lib/kubelet/pods/6c219392-6984-409d-b05c-76faaf866f4b/volumes" Mar 11 10:12:36 crc kubenswrapper[4830]: I0311 10:12:36.871616 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7888bcc99b-t8slf_bbe664eb-daf0-4aeb-ae09-f47b2204bdf1/barbican-api/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.043256 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7888bcc99b-t8slf_bbe664eb-daf0-4aeb-ae09-f47b2204bdf1/barbican-api-log/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.092280 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c976ddb9d-ppssd_98ddc718-e67e-406f-aae3-03680232691b/barbican-keystone-listener/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.113403 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c976ddb9d-ppssd_98ddc718-e67e-406f-aae3-03680232691b/barbican-keystone-listener-log/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.273336 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bddfb9bc9-6hzsp_b573b144-d9a4-4ea5-8b28-d9e4e3ed6274/barbican-worker/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.291715 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bddfb9bc9-6hzsp_b573b144-d9a4-4ea5-8b28-d9e4e3ed6274/barbican-worker-log/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.537707 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9a97be49-616f-4338-b04a-9928016b4c26/ceilometer-central-agent/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.547980 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d_751133b2-5530-48d1-9cb0-4e69aadf979a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.567270 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9a97be49-616f-4338-b04a-9928016b4c26/ceilometer-notification-agent/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.703827 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9a97be49-616f-4338-b04a-9928016b4c26/sg-core/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.722483 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9a97be49-616f-4338-b04a-9928016b4c26/proxy-httpd/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.817478 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f96f1f82-873d-4665-8273-65bfc41ba374/cinder-api/0.log" Mar 11 10:12:37 crc kubenswrapper[4830]: I0311 10:12:37.908539 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f96f1f82-873d-4665-8273-65bfc41ba374/cinder-api-log/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 
10:12:38.007127 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_65554cb9-6d98-4e70-8feb-73029d8184dc/cinder-scheduler/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.061514 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_65554cb9-6d98-4e70-8feb-73029d8184dc/probe/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.205458 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx_9711332d-adac-4289-81e4-686135601f68/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.297274 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2_aacf9f52-24a2-462c-8957-3fb5c88988d3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.400388 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-cms8z_e6b1a549-c16a-4efe-83df-800de8dbdac2/init/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.576918 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-cms8z_e6b1a549-c16a-4efe-83df-800de8dbdac2/init/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.729120 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-cms8z_e6b1a549-c16a-4efe-83df-800de8dbdac2/dnsmasq-dns/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.735107 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5_fae734f9-b26d-4252-b943-b09b3e235cfa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.892481 4830 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_glance-default-external-api-0_4c8947c5-6c54-4acb-9100-3c5ea0988770/glance-httpd/0.log" Mar 11 10:12:38 crc kubenswrapper[4830]: I0311 10:12:38.917313 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c8947c5-6c54-4acb-9100-3c5ea0988770/glance-log/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.060127 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a6069e67-6f76-4a02-9c90-d1ac74d8aaca/glance-httpd/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.064145 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a6069e67-6f76-4a02-9c90-d1ac74d8aaca/glance-log/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.258584 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-789dc4b6cd-xz7ds_77e86c78-b565-4e6c-8867-519fa2d5137a/horizon/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.448583 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-672vg_13dfb6c4-9546-4a13-bc42-842a71c96c6c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.515728 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-789dc4b6cd-xz7ds_77e86c78-b565-4e6c-8867-519fa2d5137a/horizon-log/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.566633 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-jx2k4_5132dffb-d28b-494f-891d-ea13b54a5a72/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.811795 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29553721-rtbwn_04ab5666-8c5d-4e96-9c47-502bdc63bafb/keystone-cron/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.863213 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-559977bfdc-r7ssx_3d8403ac-71e1-41f2-a897-bf61055308f6/keystone-api/0.log" Mar 11 10:12:39 crc kubenswrapper[4830]: I0311 10:12:39.996546 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e85b49bc-4607-4852-9fce-dcf43af1069f/kube-state-metrics/0.log" Mar 11 10:12:40 crc kubenswrapper[4830]: I0311 10:12:40.110859 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq_bfdc6f64-813a-4a57-a123-b4d15c6ae569/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:40 crc kubenswrapper[4830]: I0311 10:12:40.580164 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54c74bff69-478cc_55875815-4467-4c5e-8401-b220cb1694c6/neutron-httpd/0.log" Mar 11 10:12:40 crc kubenswrapper[4830]: I0311 10:12:40.649394 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54c74bff69-478cc_55875815-4467-4c5e-8401-b220cb1694c6/neutron-api/0.log" Mar 11 10:12:40 crc kubenswrapper[4830]: I0311 10:12:40.802731 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85_48be5fba-f61d-4475-bbaf-df6ece9da972/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:41 crc kubenswrapper[4830]: I0311 10:12:41.313193 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_efc74c44-bbbb-4dd9-b762-f7c483d0e336/nova-cell0-conductor-conductor/0.log" Mar 11 10:12:41 crc kubenswrapper[4830]: I0311 10:12:41.366492 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_d3eb0127-a012-4cbf-8768-84e20518f316/nova-api-log/0.log" Mar 11 10:12:41 crc kubenswrapper[4830]: I0311 10:12:41.468524 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d3eb0127-a012-4cbf-8768-84e20518f316/nova-api-api/0.log" Mar 11 10:12:41 crc kubenswrapper[4830]: I0311 10:12:41.694382 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f690df7f-ca68-4a5d-8e9e-4d7d55df4773/nova-cell1-novncproxy-novncproxy/0.log" Mar 11 10:12:41 crc kubenswrapper[4830]: I0311 10:12:41.717584 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e6bc7c45-10c9-4571-923a-4fb4b861657e/nova-cell1-conductor-conductor/0.log" Mar 11 10:12:41 crc kubenswrapper[4830]: I0311 10:12:41.874802 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-79mrm_df44fa5f-956c-47f8-af60-49a95e1c6da1/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:42 crc kubenswrapper[4830]: I0311 10:12:42.019597 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_72993026-e5ee-42ee-9381-36ec25d1d1d0/nova-metadata-log/0.log" Mar 11 10:12:42 crc kubenswrapper[4830]: I0311 10:12:42.362316 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c53770ee-2b8a-4e7a-a59c-bc09739ce4e5/nova-scheduler-scheduler/0.log" Mar 11 10:12:42 crc kubenswrapper[4830]: I0311 10:12:42.384554 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ce0c7bf-830f-40f3-850f-19b0a879ba23/mysql-bootstrap/0.log" Mar 11 10:12:42 crc kubenswrapper[4830]: I0311 10:12:42.608294 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ce0c7bf-830f-40f3-850f-19b0a879ba23/galera/0.log" Mar 11 10:12:42 crc kubenswrapper[4830]: I0311 10:12:42.644654 4830 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ce0c7bf-830f-40f3-850f-19b0a879ba23/mysql-bootstrap/0.log" Mar 11 10:12:42 crc kubenswrapper[4830]: I0311 10:12:42.827325 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ad5f765-f3dd-42f3-9829-2323ea982c58/mysql-bootstrap/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.039301 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ad5f765-f3dd-42f3-9829-2323ea982c58/mysql-bootstrap/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.091812 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ad5f765-f3dd-42f3-9829-2323ea982c58/galera/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.162085 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_72993026-e5ee-42ee-9381-36ec25d1d1d0/nova-metadata-metadata/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.212029 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c113279e-3264-4a62-8c50-5ddb2be700bb/openstackclient/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.343721 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dp7ql_1a868478-8050-4c0f-a7f4-d6dcc82f9832/openstack-network-exporter/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.481867 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-klr5s_97278c10-fe96-4de6-86cf-09ff64444a59/ovsdb-server-init/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.657250 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-klr5s_97278c10-fe96-4de6-86cf-09ff64444a59/ovsdb-server/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.669530 4830 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-klr5s_97278c10-fe96-4de6-86cf-09ff64444a59/ovs-vswitchd/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.670078 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-klr5s_97278c10-fe96-4de6-86cf-09ff64444a59/ovsdb-server-init/0.log" Mar 11 10:12:43 crc kubenswrapper[4830]: I0311 10:12:43.865361 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xjsks_d97948cc-fc42-46c8-b46e-3f8efdc251db/ovn-controller/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.018357 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kqw9x_ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.134678 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0/openstack-network-exporter/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.238781 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0/ovn-northd/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.264096 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a1424a67-6a71-4943-b855-4795d2427214/openstack-network-exporter/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.318243 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a1424a67-6a71-4943-b855-4795d2427214/ovsdbserver-nb/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.481839 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8f9d7ab5-467c-4888-8759-6e2ef59957e5/openstack-network-exporter/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.506478 4830 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8f9d7ab5-467c-4888-8759-6e2ef59957e5/ovsdbserver-sb/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.788848 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599b4448-86g7s_da8c023e-1cf1-4a06-8c20-2b79612f7ae8/placement-api/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.818208 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599b4448-86g7s_da8c023e-1cf1-4a06-8c20-2b79612f7ae8/placement-log/0.log" Mar 11 10:12:44 crc kubenswrapper[4830]: I0311 10:12:44.870304 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5da20462-be2b-466c-9c04-17b6a0a94572/setup-container/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.049036 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5da20462-be2b-466c-9c04-17b6a0a94572/setup-container/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.071422 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5da20462-be2b-466c-9c04-17b6a0a94572/rabbitmq/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.079962 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e0f47113-88e8-4b57-b9df-1ff8b05cde01/setup-container/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.342976 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e0f47113-88e8-4b57-b9df-1ff8b05cde01/rabbitmq/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.381450 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e0f47113-88e8-4b57-b9df-1ff8b05cde01/setup-container/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.404278 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56_592c8d08-ac0e-4665-9d65-e362412b7867/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.621782 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-mrqxc_2b0b1934-6dd3-441c-923d-67b9ed28a177/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.680731 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs_d57e6a98-80e8-40a0-af5d-56d936e6ab67/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.843197 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xf8b7_9ae7bc18-6614-4094-961f-9590aa0346f4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:45 crc kubenswrapper[4830]: I0311 10:12:45.961173 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-swd9t_21e999cb-ceca-44f0-a7e8-cf0d801e84a7/ssh-known-hosts-edpm-deployment/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.283625 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7f56859c77-bnqc2_fcbbcfad-16c7-4040-9b06-b2ff9f4c5666/proxy-server/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.389338 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7f56859c77-bnqc2_fcbbcfad-16c7-4040-9b06-b2ff9f4c5666/proxy-httpd/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.637425 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-jmd8g_9e52669c-56df-4791-84e7-4d4bd34e420f/swift-ring-rebalance/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.704433 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/account-auditor/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.709347 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/account-reaper/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.856214 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/account-server/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.898356 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/account-replicator/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.965558 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/container-replicator/0.log" Mar 11 10:12:46 crc kubenswrapper[4830]: I0311 10:12:46.998846 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/container-auditor/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.051490 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/container-server/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.120349 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/container-updater/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.274604 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-auditor/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.283416 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-expirer/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.313977 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-replicator/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.392821 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-server/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.508580 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/rsync/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.527718 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-updater/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.578838 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/swift-recon-cron/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.813094 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3/tempest-tests-tempest-tests-runner/0.log" Mar 11 10:12:47 crc kubenswrapper[4830]: I0311 10:12:47.842702 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4_d6daac1f-f36f-42a1-9735-1b182e03052e/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:48 crc kubenswrapper[4830]: I0311 10:12:48.099438 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8_4d5247a6-36f4-4260-88bd-659f66f5efc0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:12:48 crc 
kubenswrapper[4830]: I0311 10:12:48.174965 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c01f3e65-5b46-4373-a420-2d966d66a081/test-operator-logs-container/0.log" Mar 11 10:12:56 crc kubenswrapper[4830]: I0311 10:12:56.856765 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_60740879-ec5c-4d1f-bfd0-68ec5e8960f2/memcached/0.log" Mar 11 10:13:11 crc kubenswrapper[4830]: I0311 10:13:11.677823 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/util/0.log" Mar 11 10:13:11 crc kubenswrapper[4830]: I0311 10:13:11.851375 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/util/0.log" Mar 11 10:13:11 crc kubenswrapper[4830]: I0311 10:13:11.881511 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/pull/0.log" Mar 11 10:13:11 crc kubenswrapper[4830]: I0311 10:13:11.956210 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/pull/0.log" Mar 11 10:13:12 crc kubenswrapper[4830]: I0311 10:13:12.095001 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/pull/0.log" Mar 11 10:13:12 crc kubenswrapper[4830]: I0311 10:13:12.101065 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/util/0.log" Mar 11 10:13:12 crc kubenswrapper[4830]: I0311 10:13:12.137134 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/extract/0.log" Mar 11 10:13:12 crc kubenswrapper[4830]: I0311 10:13:12.540350 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-n5khc_16121653-f66c-441b-b1e2-8cd3c1e558e4/manager/0.log" Mar 11 10:13:12 crc kubenswrapper[4830]: I0311 10:13:12.892471 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-7kzdv_d7442149-a02a-401b-b3bd-c1d470af5b3b/manager/0.log" Mar 11 10:13:13 crc kubenswrapper[4830]: I0311 10:13:13.003868 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-xg456_3112f394-9b8e-43c2-9707-94ac1a2778db/manager/0.log" Mar 11 10:13:13 crc kubenswrapper[4830]: I0311 10:13:13.197898 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-7cg7m_283b8bf7-046d-4600-be30-f578a6ec3c4d/manager/0.log" Mar 11 10:13:13 crc kubenswrapper[4830]: I0311 10:13:13.753470 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-mtd4g_fe16642f-b4c0-45e6-b222-83fcc2c3fb5c/manager/0.log" Mar 11 10:13:13 crc kubenswrapper[4830]: I0311 10:13:13.800395 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-rjsvm_ceffcca8-5182-4f52-b359-e20664c1d527/manager/0.log" Mar 11 10:13:13 crc kubenswrapper[4830]: I0311 10:13:13.844654 4830 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-ntcnm_74ba9d62-2d47-46a5-bd26-1a81bb0a8484/manager/0.log" Mar 11 10:13:14 crc kubenswrapper[4830]: I0311 10:13:14.063283 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-bcbp5_1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f/manager/0.log" Mar 11 10:13:14 crc kubenswrapper[4830]: I0311 10:13:14.209169 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-9pvn7_2a85b060-3965-4d51-b568-2b360fee4c44/manager/0.log" Mar 11 10:13:14 crc kubenswrapper[4830]: I0311 10:13:14.401921 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-q98j9_c29c2a15-0eb3-41aa-b0b9-710a5ed56a87/manager/0.log" Mar 11 10:13:14 crc kubenswrapper[4830]: I0311 10:13:14.601993 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-5gkz6_6ee90085-fc25-4491-a2fc-9b45d5d8207a/manager/0.log" Mar 11 10:13:14 crc kubenswrapper[4830]: I0311 10:13:14.820727 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-c5vgk_2525ca9b-eb81-4fce-86b4-a767db795de6/manager/0.log" Mar 11 10:13:14 crc kubenswrapper[4830]: I0311 10:13:14.880511 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-kwh58_5602b15d-928b-4138-a7f0-66f8e8d037b8/manager/0.log" Mar 11 10:13:15 crc kubenswrapper[4830]: I0311 10:13:15.108093 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h_94d241ed-64bc-4152-b445-51ae5a61bb95/manager/0.log" Mar 11 10:13:15 crc kubenswrapper[4830]: I0311 
10:13:15.489502 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-67d889964b-xg4rh_843e8d8e-4cb5-4260-af55-147e416c0791/operator/0.log" Mar 11 10:13:15 crc kubenswrapper[4830]: I0311 10:13:15.736412 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6wtvn_fdd7a557-69d6-4baf-89e5-a8bf6219aaa0/registry-server/0.log" Mar 11 10:13:16 crc kubenswrapper[4830]: I0311 10:13:16.039786 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-xmctv_f6f6b27c-94d8-456d-8d41-19c905065e1d/manager/0.log" Mar 11 10:13:16 crc kubenswrapper[4830]: I0311 10:13:16.249531 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-57m9k_dedc5b41-d549-4015-b010-bc07cea3d318/manager/0.log" Mar 11 10:13:16 crc kubenswrapper[4830]: I0311 10:13:16.341602 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-nkqm4_14e5e0c3-1203-4a07-93bd-94578a7f0cb2/operator/0.log" Mar 11 10:13:16 crc kubenswrapper[4830]: I0311 10:13:16.555850 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-6rnp4_15d414b6-a515-4db4-b60c-a2b34004ea9c/manager/0.log" Mar 11 10:13:16 crc kubenswrapper[4830]: I0311 10:13:16.801739 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-jkths_c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0/manager/0.log" Mar 11 10:13:16 crc kubenswrapper[4830]: I0311 10:13:16.838756 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-jhcz9_f488d4b3-55b5-424e-b00e-0bd262fc5f4f/manager/0.log" Mar 11 10:13:17 crc kubenswrapper[4830]: I0311 
10:13:17.023547 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-vjktt_733a981c-36a1-442b-8e24-16a7498efc54/manager/0.log" Mar 11 10:13:17 crc kubenswrapper[4830]: I0311 10:13:17.079562 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fcc5fcbf7-mw66h_f24d67a9-4996-4315-8c38-fa4ef58e0a52/manager/0.log" Mar 11 10:13:19 crc kubenswrapper[4830]: I0311 10:13:19.109092 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-cxpww_d72c70bc-5f58-4c0f-a584-f352adf175e7/manager/0.log" Mar 11 10:13:35 crc kubenswrapper[4830]: I0311 10:13:35.587532 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tqscv_cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c/control-plane-machine-set-operator/0.log" Mar 11 10:13:35 crc kubenswrapper[4830]: I0311 10:13:35.753044 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s8cnh_fd48e28d-bcdd-4bba-a540-0213cda9599a/machine-api-operator/0.log" Mar 11 10:13:35 crc kubenswrapper[4830]: I0311 10:13:35.760171 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s8cnh_fd48e28d-bcdd-4bba-a540-0213cda9599a/kube-rbac-proxy/0.log" Mar 11 10:13:43 crc kubenswrapper[4830]: I0311 10:13:43.059985 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:13:43 crc kubenswrapper[4830]: I0311 10:13:43.060401 4830 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:13:47 crc kubenswrapper[4830]: I0311 10:13:47.835727 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-s7blt_1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d/cert-manager-controller/0.log" Mar 11 10:13:48 crc kubenswrapper[4830]: I0311 10:13:48.020246 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-9vzgv_c952f67d-03e4-4c30-a44b-884f26d81c4e/cert-manager-cainjector/0.log" Mar 11 10:13:48 crc kubenswrapper[4830]: I0311 10:13:48.056615 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-xlw9r_8278ba6d-7719-4b12-9f80-29867e6fc2ba/cert-manager-webhook/0.log" Mar 11 10:13:59 crc kubenswrapper[4830]: I0311 10:13:59.679739 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-ztp77_7bed2ffb-6685-4495-badf-1c70ea17d8fa/nmstate-console-plugin/0.log" Mar 11 10:13:59 crc kubenswrapper[4830]: I0311 10:13:59.943310 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6gnfd_3dec7623-b8d0-4aa6-9a7f-0796475bcaaf/nmstate-handler/0.log" Mar 11 10:13:59 crc kubenswrapper[4830]: I0311 10:13:59.950272 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qgmxm_20718750-ed46-4785-b2ca-0e41dfd093be/kube-rbac-proxy/0.log" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.118825 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-zqpc6_b282dd08-59c0-4a26-a7a0-e165dfc899b6/nmstate-operator/0.log" Mar 11 10:14:00 crc kubenswrapper[4830]: 
I0311 10:14:00.122516 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qgmxm_20718750-ed46-4785-b2ca-0e41dfd093be/nmstate-metrics/0.log" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.144335 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553734-qdrg7"] Mar 11 10:14:00 crc kubenswrapper[4830]: E0311 10:14:00.144705 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c219392-6984-409d-b05c-76faaf866f4b" containerName="container-00" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.144720 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c219392-6984-409d-b05c-76faaf866f4b" containerName="container-00" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.144912 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c219392-6984-409d-b05c-76faaf866f4b" containerName="container-00" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.145533 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553734-qdrg7" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.147942 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.148114 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.148911 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.159106 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553734-qdrg7"] Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.205571 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqh5\" (UniqueName: \"kubernetes.io/projected/c2d6322f-95db-4f57-bbc6-1dfd2a023536-kube-api-access-qjqh5\") pod \"auto-csr-approver-29553734-qdrg7\" (UID: \"c2d6322f-95db-4f57-bbc6-1dfd2a023536\") " pod="openshift-infra/auto-csr-approver-29553734-qdrg7" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.307529 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqh5\" (UniqueName: \"kubernetes.io/projected/c2d6322f-95db-4f57-bbc6-1dfd2a023536-kube-api-access-qjqh5\") pod \"auto-csr-approver-29553734-qdrg7\" (UID: \"c2d6322f-95db-4f57-bbc6-1dfd2a023536\") " pod="openshift-infra/auto-csr-approver-29553734-qdrg7" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.327256 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-vzfnr_9e81d681-fa0d-4789-8762-ee953dc9f5aa/nmstate-webhook/0.log" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.327571 4830 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-qjqh5\" (UniqueName: \"kubernetes.io/projected/c2d6322f-95db-4f57-bbc6-1dfd2a023536-kube-api-access-qjqh5\") pod \"auto-csr-approver-29553734-qdrg7\" (UID: \"c2d6322f-95db-4f57-bbc6-1dfd2a023536\") " pod="openshift-infra/auto-csr-approver-29553734-qdrg7" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.466286 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553734-qdrg7" Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.972166 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553734-qdrg7"] Mar 11 10:14:00 crc kubenswrapper[4830]: I0311 10:14:00.978720 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 10:14:01 crc kubenswrapper[4830]: I0311 10:14:01.513611 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553734-qdrg7" event={"ID":"c2d6322f-95db-4f57-bbc6-1dfd2a023536","Type":"ContainerStarted","Data":"dfc52f945cbf4c142c1edbbd3369b5f8ac04f029d7a1fb5f023b0781f9a0cbeb"} Mar 11 10:14:03 crc kubenswrapper[4830]: I0311 10:14:03.530182 4830 generic.go:334] "Generic (PLEG): container finished" podID="c2d6322f-95db-4f57-bbc6-1dfd2a023536" containerID="8b80bc7215e2511c0f59c2b82bdbe8dbacf517badbe2330f90b029366c7c373a" exitCode=0 Mar 11 10:14:03 crc kubenswrapper[4830]: I0311 10:14:03.530233 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553734-qdrg7" event={"ID":"c2d6322f-95db-4f57-bbc6-1dfd2a023536","Type":"ContainerDied","Data":"8b80bc7215e2511c0f59c2b82bdbe8dbacf517badbe2330f90b029366c7c373a"} Mar 11 10:14:04 crc kubenswrapper[4830]: I0311 10:14:04.878860 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553734-qdrg7" Mar 11 10:14:04 crc kubenswrapper[4830]: I0311 10:14:04.902189 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjqh5\" (UniqueName: \"kubernetes.io/projected/c2d6322f-95db-4f57-bbc6-1dfd2a023536-kube-api-access-qjqh5\") pod \"c2d6322f-95db-4f57-bbc6-1dfd2a023536\" (UID: \"c2d6322f-95db-4f57-bbc6-1dfd2a023536\") " Mar 11 10:14:04 crc kubenswrapper[4830]: I0311 10:14:04.908191 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d6322f-95db-4f57-bbc6-1dfd2a023536-kube-api-access-qjqh5" (OuterVolumeSpecName: "kube-api-access-qjqh5") pod "c2d6322f-95db-4f57-bbc6-1dfd2a023536" (UID: "c2d6322f-95db-4f57-bbc6-1dfd2a023536"). InnerVolumeSpecName "kube-api-access-qjqh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:14:05 crc kubenswrapper[4830]: I0311 10:14:05.004971 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjqh5\" (UniqueName: \"kubernetes.io/projected/c2d6322f-95db-4f57-bbc6-1dfd2a023536-kube-api-access-qjqh5\") on node \"crc\" DevicePath \"\"" Mar 11 10:14:05 crc kubenswrapper[4830]: I0311 10:14:05.550305 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553734-qdrg7" event={"ID":"c2d6322f-95db-4f57-bbc6-1dfd2a023536","Type":"ContainerDied","Data":"dfc52f945cbf4c142c1edbbd3369b5f8ac04f029d7a1fb5f023b0781f9a0cbeb"} Mar 11 10:14:05 crc kubenswrapper[4830]: I0311 10:14:05.550369 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc52f945cbf4c142c1edbbd3369b5f8ac04f029d7a1fb5f023b0781f9a0cbeb" Mar 11 10:14:05 crc kubenswrapper[4830]: I0311 10:14:05.550427 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553734-qdrg7" Mar 11 10:14:05 crc kubenswrapper[4830]: I0311 10:14:05.948517 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553728-v7dwh"] Mar 11 10:14:05 crc kubenswrapper[4830]: I0311 10:14:05.958113 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553728-v7dwh"] Mar 11 10:14:06 crc kubenswrapper[4830]: I0311 10:14:06.943864 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47589ba4-9dbf-430b-b1f2-c1507faeab74" path="/var/lib/kubelet/pods/47589ba4-9dbf-430b-b1f2-c1507faeab74/volumes" Mar 11 10:14:13 crc kubenswrapper[4830]: I0311 10:14:13.060247 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:14:13 crc kubenswrapper[4830]: I0311 10:14:13.060851 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:14:13 crc kubenswrapper[4830]: I0311 10:14:13.725374 4830 scope.go:117] "RemoveContainer" containerID="f250a6e273286fdb98ef1e5df4c3044cb79a3d7b4e7b3a11a1a566ed785cd1d1" Mar 11 10:14:27 crc kubenswrapper[4830]: I0311 10:14:27.298872 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qg75l_29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be/kube-rbac-proxy/0.log" Mar 11 10:14:27 crc kubenswrapper[4830]: I0311 10:14:27.471407 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qg75l_29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be/controller/0.log" Mar 11 10:14:27 crc kubenswrapper[4830]: I0311 10:14:27.736468 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-frr-files/0.log" Mar 11 10:14:27 crc kubenswrapper[4830]: I0311 10:14:27.937097 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-frr-files/0.log" Mar 11 10:14:27 crc kubenswrapper[4830]: I0311 10:14:27.988840 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-reloader/0.log" Mar 11 10:14:27 crc kubenswrapper[4830]: I0311 10:14:27.992465 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-metrics/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.026210 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-reloader/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.254629 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-frr-files/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.254633 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-metrics/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.267708 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-reloader/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.299373 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-metrics/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.441212 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-frr-files/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.460744 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-metrics/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.502026 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/controller/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.508667 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-reloader/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.698702 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/frr-metrics/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.720669 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/kube-rbac-proxy/0.log" Mar 11 10:14:28 crc kubenswrapper[4830]: I0311 10:14:28.725306 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/kube-rbac-proxy-frr/0.log" Mar 11 10:14:29 crc kubenswrapper[4830]: I0311 10:14:29.100412 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/reloader/0.log" Mar 11 10:14:29 crc kubenswrapper[4830]: I0311 10:14:29.175993 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-svpm2_bd1c2f0c-c126-4cd9-863d-6ec94f3920ba/frr-k8s-webhook-server/0.log" Mar 11 10:14:29 crc kubenswrapper[4830]: I0311 10:14:29.547799 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85c56b6668-mc4fj_e3733633-d23b-4ef9-90cf-89614677589d/manager/0.log" Mar 11 10:14:29 crc kubenswrapper[4830]: I0311 10:14:29.704542 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d9865c9bc-zgs5r_847b6273-e498-4025-a834-41173cfce564/webhook-server/0.log" Mar 11 10:14:29 crc kubenswrapper[4830]: I0311 10:14:29.852500 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gmrbg_08cd58a8-ee9e-44a8-874f-2187733e6d57/kube-rbac-proxy/0.log" Mar 11 10:14:30 crc kubenswrapper[4830]: I0311 10:14:30.293565 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/frr/0.log" Mar 11 10:14:30 crc kubenswrapper[4830]: I0311 10:14:30.332535 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gmrbg_08cd58a8-ee9e-44a8-874f-2187733e6d57/speaker/0.log" Mar 11 10:14:42 crc kubenswrapper[4830]: I0311 10:14:42.876230 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/util/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.060903 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.061282 4830 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.061333 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.062086 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.062146 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" gracePeriod=600 Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.115176 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/pull/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.144178 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/util/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.173617 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/pull/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: E0311 10:14:43.199881 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.375810 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/util/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.401140 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/pull/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.419690 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/extract/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.530954 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/util/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.759027 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/util/0.log" Mar 11 10:14:43 crc 
kubenswrapper[4830]: I0311 10:14:43.788001 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/pull/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.817076 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/pull/0.log" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.921679 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" exitCode=0 Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.921733 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"} Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.921772 4830 scope.go:117] "RemoveContainer" containerID="39b77e2518e07260183281640d393fb923092c241d46b9fda5255d21c4ba975d" Mar 11 10:14:43 crc kubenswrapper[4830]: I0311 10:14:43.922534 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:14:43 crc kubenswrapper[4830]: E0311 10:14:43.922914 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 
10:14:44.017423 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/extract/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.052795 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/util/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.053044 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/pull/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.224160 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-utilities/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.424569 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-content/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.455513 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-content/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.465971 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-utilities/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.627341 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-content/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.648190 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-utilities/0.log"
Mar 11 10:14:44 crc kubenswrapper[4830]: I0311 10:14:44.862981 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-utilities/0.log"
Mar 11 10:14:45 crc kubenswrapper[4830]: I0311 10:14:45.011956 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/registry-server/0.log"
Mar 11 10:14:45 crc kubenswrapper[4830]: I0311 10:14:45.055940 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-content/0.log"
Mar 11 10:14:45 crc kubenswrapper[4830]: I0311 10:14:45.077517 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-utilities/0.log"
Mar 11 10:14:45 crc kubenswrapper[4830]: I0311 10:14:45.108974 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-content/0.log"
Mar 11 10:14:45 crc kubenswrapper[4830]: I0311 10:14:45.424731 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-utilities/0.log"
Mar 11 10:14:45 crc kubenswrapper[4830]: I0311 10:14:45.439675 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-content/0.log"
Mar 11 10:14:45 crc kubenswrapper[4830]: I0311 10:14:45.670923 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-vjsls_74cca8dd-8cb4-41db-9626-9612877ad60e/marketplace-operator/0.log"
Mar 11 10:14:45 crc kubenswrapper[4830]: I0311 10:14:45.748866 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-utilities/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.024410 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-utilities/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.081729 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-content/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.107356 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-content/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.169067 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/registry-server/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.255327 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-content/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.292069 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-utilities/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.482192 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/registry-server/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.556954 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-utilities/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.734359 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-utilities/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.742471 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-content/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.795528 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-content/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.892963 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-content/0.log"
Mar 11 10:14:46 crc kubenswrapper[4830]: I0311 10:14:46.962230 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-utilities/0.log"
Mar 11 10:14:47 crc kubenswrapper[4830]: I0311 10:14:47.491994 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/registry-server/0.log"
Mar 11 10:14:58 crc kubenswrapper[4830]: I0311 10:14:58.932579 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:14:58 crc kubenswrapper[4830]: E0311 10:14:58.933344 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.149949 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"]
Mar 11 10:15:00 crc kubenswrapper[4830]: E0311 10:15:00.150476 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d6322f-95db-4f57-bbc6-1dfd2a023536" containerName="oc"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.150499 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d6322f-95db-4f57-bbc6-1dfd2a023536" containerName="oc"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.150736 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d6322f-95db-4f57-bbc6-1dfd2a023536" containerName="oc"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.151595 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.154769 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.155085 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.163686 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"]
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.234893 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-secret-volume\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.235292 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-config-volume\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.235372 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnmwb\" (UniqueName: \"kubernetes.io/projected/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-kube-api-access-gnmwb\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.337408 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-secret-volume\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.337451 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-config-volume\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.337495 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnmwb\" (UniqueName: \"kubernetes.io/projected/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-kube-api-access-gnmwb\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.338391 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-config-volume\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.350726 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-secret-volume\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.353410 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnmwb\" (UniqueName: \"kubernetes.io/projected/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-kube-api-access-gnmwb\") pod \"collect-profiles-29553735-szjzh\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:00 crc kubenswrapper[4830]: I0311 10:15:00.482259 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:01 crc kubenswrapper[4830]: I0311 10:15:01.029164 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"]
Mar 11 10:15:01 crc kubenswrapper[4830]: I0311 10:15:01.105879 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh" event={"ID":"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9","Type":"ContainerStarted","Data":"4bdaefa1b7d4b737cd0767ca6e8262ffec62fb29335ad2d79cd876feaa7a1e65"}
Mar 11 10:15:02 crc kubenswrapper[4830]: I0311 10:15:02.121140 4830 generic.go:334] "Generic (PLEG): container finished" podID="7510a0e9-1c7b-4aaf-bb53-df9be63a12a9" containerID="0378468db514c284ad769701ea72e5ed683c2ed38d275e6a07c9c05975670b06" exitCode=0
Mar 11 10:15:02 crc kubenswrapper[4830]: I0311 10:15:02.121247 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh" event={"ID":"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9","Type":"ContainerDied","Data":"0378468db514c284ad769701ea72e5ed683c2ed38d275e6a07c9c05975670b06"}
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.640192 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.708396 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-config-volume\") pod \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") "
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.708474 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnmwb\" (UniqueName: \"kubernetes.io/projected/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-kube-api-access-gnmwb\") pod \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") "
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.708648 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-secret-volume\") pod \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\" (UID: \"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9\") "
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.709565 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "7510a0e9-1c7b-4aaf-bb53-df9be63a12a9" (UID: "7510a0e9-1c7b-4aaf-bb53-df9be63a12a9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.716826 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7510a0e9-1c7b-4aaf-bb53-df9be63a12a9" (UID: "7510a0e9-1c7b-4aaf-bb53-df9be63a12a9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.720699 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-kube-api-access-gnmwb" (OuterVolumeSpecName: "kube-api-access-gnmwb") pod "7510a0e9-1c7b-4aaf-bb53-df9be63a12a9" (UID: "7510a0e9-1c7b-4aaf-bb53-df9be63a12a9"). InnerVolumeSpecName "kube-api-access-gnmwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.812807 4830 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.812852 4830 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-config-volume\") on node \"crc\" DevicePath \"\""
Mar 11 10:15:03 crc kubenswrapper[4830]: I0311 10:15:03.812868 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnmwb\" (UniqueName: \"kubernetes.io/projected/7510a0e9-1c7b-4aaf-bb53-df9be63a12a9-kube-api-access-gnmwb\") on node \"crc\" DevicePath \"\""
Mar 11 10:15:04 crc kubenswrapper[4830]: I0311 10:15:04.157803 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh" event={"ID":"7510a0e9-1c7b-4aaf-bb53-df9be63a12a9","Type":"ContainerDied","Data":"4bdaefa1b7d4b737cd0767ca6e8262ffec62fb29335ad2d79cd876feaa7a1e65"}
Mar 11 10:15:04 crc kubenswrapper[4830]: I0311 10:15:04.158212 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29553735-szjzh"
Mar 11 10:15:04 crc kubenswrapper[4830]: I0311 10:15:04.158226 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bdaefa1b7d4b737cd0767ca6e8262ffec62fb29335ad2d79cd876feaa7a1e65"
Mar 11 10:15:04 crc kubenswrapper[4830]: I0311 10:15:04.739956 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg"]
Mar 11 10:15:04 crc kubenswrapper[4830]: I0311 10:15:04.752413 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29553690-6cbfg"]
Mar 11 10:15:04 crc kubenswrapper[4830]: I0311 10:15:04.944399 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a48921cf-d963-44f0-85ac-224081bf9848" path="/var/lib/kubelet/pods/a48921cf-d963-44f0-85ac-224081bf9848/volumes"
Mar 11 10:15:09 crc kubenswrapper[4830]: I0311 10:15:09.933058 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:15:09 crc kubenswrapper[4830]: E0311 10:15:09.933852 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:15:13 crc kubenswrapper[4830]: I0311 10:15:13.815980 4830 scope.go:117] "RemoveContainer" containerID="5c1c38b6c7344cc9d9855a8ca2f10df747870f247e36fe59c489d0a1808a6480"
Mar 11 10:15:23 crc kubenswrapper[4830]: I0311 10:15:23.932590 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:15:23 crc kubenswrapper[4830]: E0311 10:15:23.933372 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:15:34 crc kubenswrapper[4830]: I0311 10:15:34.935275 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:15:34 crc kubenswrapper[4830]: E0311 10:15:34.936331 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:15:46 crc kubenswrapper[4830]: I0311 10:15:46.933178 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:15:46 crc kubenswrapper[4830]: E0311 10:15:46.933978 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.144580 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553736-dpcrm"]
Mar 11 10:16:00 crc kubenswrapper[4830]: E0311 10:16:00.145790 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7510a0e9-1c7b-4aaf-bb53-df9be63a12a9" containerName="collect-profiles"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.145809 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="7510a0e9-1c7b-4aaf-bb53-df9be63a12a9" containerName="collect-profiles"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.146047 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="7510a0e9-1c7b-4aaf-bb53-df9be63a12a9" containerName="collect-profiles"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.146855 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553736-dpcrm"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.149908 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.150239 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.150756 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.158860 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553736-dpcrm"]
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.315757 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq2pz\" (UniqueName: \"kubernetes.io/projected/8d6a38b8-7842-4127-9926-e477ab93d5d6-kube-api-access-vq2pz\") pod \"auto-csr-approver-29553736-dpcrm\" (UID: \"8d6a38b8-7842-4127-9926-e477ab93d5d6\") " pod="openshift-infra/auto-csr-approver-29553736-dpcrm"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.418641 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq2pz\" (UniqueName: \"kubernetes.io/projected/8d6a38b8-7842-4127-9926-e477ab93d5d6-kube-api-access-vq2pz\") pod \"auto-csr-approver-29553736-dpcrm\" (UID: \"8d6a38b8-7842-4127-9926-e477ab93d5d6\") " pod="openshift-infra/auto-csr-approver-29553736-dpcrm"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.443132 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq2pz\" (UniqueName: \"kubernetes.io/projected/8d6a38b8-7842-4127-9926-e477ab93d5d6-kube-api-access-vq2pz\") pod \"auto-csr-approver-29553736-dpcrm\" (UID: \"8d6a38b8-7842-4127-9926-e477ab93d5d6\") " pod="openshift-infra/auto-csr-approver-29553736-dpcrm"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.493182 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553736-dpcrm"
Mar 11 10:16:00 crc kubenswrapper[4830]: I0311 10:16:00.951545 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553736-dpcrm"]
Mar 11 10:16:01 crc kubenswrapper[4830]: I0311 10:16:01.673383 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553736-dpcrm" event={"ID":"8d6a38b8-7842-4127-9926-e477ab93d5d6","Type":"ContainerStarted","Data":"c9c62a1ac9d4fe1289b30e4dc87c3978691650b0319fdc7e3cf0ac836396e556"}
Mar 11 10:16:01 crc kubenswrapper[4830]: I0311 10:16:01.933995 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:16:01 crc kubenswrapper[4830]: E0311 10:16:01.934442 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:16:02 crc kubenswrapper[4830]: I0311 10:16:02.685554 4830 generic.go:334] "Generic (PLEG): container finished" podID="8d6a38b8-7842-4127-9926-e477ab93d5d6" containerID="732cf52308a6eb0823c629e97f48bfa4f3b65cb5f5077133cb6c8827f288954f" exitCode=0
Mar 11 10:16:02 crc kubenswrapper[4830]: I0311 10:16:02.685726 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553736-dpcrm" event={"ID":"8d6a38b8-7842-4127-9926-e477ab93d5d6","Type":"ContainerDied","Data":"732cf52308a6eb0823c629e97f48bfa4f3b65cb5f5077133cb6c8827f288954f"}
Mar 11 10:16:04 crc kubenswrapper[4830]: I0311 10:16:04.170585 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553736-dpcrm"
Mar 11 10:16:04 crc kubenswrapper[4830]: I0311 10:16:04.207314 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq2pz\" (UniqueName: \"kubernetes.io/projected/8d6a38b8-7842-4127-9926-e477ab93d5d6-kube-api-access-vq2pz\") pod \"8d6a38b8-7842-4127-9926-e477ab93d5d6\" (UID: \"8d6a38b8-7842-4127-9926-e477ab93d5d6\") "
Mar 11 10:16:04 crc kubenswrapper[4830]: I0311 10:16:04.212805 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6a38b8-7842-4127-9926-e477ab93d5d6-kube-api-access-vq2pz" (OuterVolumeSpecName: "kube-api-access-vq2pz") pod "8d6a38b8-7842-4127-9926-e477ab93d5d6" (UID: "8d6a38b8-7842-4127-9926-e477ab93d5d6"). InnerVolumeSpecName "kube-api-access-vq2pz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 10:16:04 crc kubenswrapper[4830]: I0311 10:16:04.309009 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq2pz\" (UniqueName: \"kubernetes.io/projected/8d6a38b8-7842-4127-9926-e477ab93d5d6-kube-api-access-vq2pz\") on node \"crc\" DevicePath \"\""
Mar 11 10:16:04 crc kubenswrapper[4830]: I0311 10:16:04.722182 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553736-dpcrm" event={"ID":"8d6a38b8-7842-4127-9926-e477ab93d5d6","Type":"ContainerDied","Data":"c9c62a1ac9d4fe1289b30e4dc87c3978691650b0319fdc7e3cf0ac836396e556"}
Mar 11 10:16:04 crc kubenswrapper[4830]: I0311 10:16:04.722233 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9c62a1ac9d4fe1289b30e4dc87c3978691650b0319fdc7e3cf0ac836396e556"
Mar 11 10:16:04 crc kubenswrapper[4830]: I0311 10:16:04.722233 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553736-dpcrm"
Mar 11 10:16:05 crc kubenswrapper[4830]: I0311 10:16:05.245842 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553730-r2zzj"]
Mar 11 10:16:05 crc kubenswrapper[4830]: I0311 10:16:05.253953 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553730-r2zzj"]
Mar 11 10:16:06 crc kubenswrapper[4830]: I0311 10:16:06.948842 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b362a55-c049-48f8-9c1c-ed0f66ef4bc4" path="/var/lib/kubelet/pods/2b362a55-c049-48f8-9c1c-ed0f66ef4bc4/volumes"
Mar 11 10:16:13 crc kubenswrapper[4830]: I0311 10:16:13.927899 4830 scope.go:117] "RemoveContainer" containerID="bc33086c0384153e4204af1273c8a6fde7849a557a44b575c216d0b2aa047d6d"
Mar 11 10:16:16 crc kubenswrapper[4830]: I0311 10:16:16.932249 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:16:16 crc kubenswrapper[4830]: E0311 10:16:16.932853 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:16:28 crc kubenswrapper[4830]: I0311 10:16:28.932973 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:16:28 crc kubenswrapper[4830]: E0311 10:16:28.933780 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:16:39 crc kubenswrapper[4830]: I0311 10:16:39.052574 4830 generic.go:334] "Generic (PLEG): container finished" podID="98e5de6b-39e2-468d-a621-264332585f2f" containerID="42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c" exitCode=0
Mar 11 10:16:39 crc kubenswrapper[4830]: I0311 10:16:39.052624 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-znshr/must-gather-wm52g" event={"ID":"98e5de6b-39e2-468d-a621-264332585f2f","Type":"ContainerDied","Data":"42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c"}
Mar 11 10:16:39 crc kubenswrapper[4830]: I0311 10:16:39.054964 4830 scope.go:117] "RemoveContainer" containerID="42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c"
Mar 11 10:16:39 crc kubenswrapper[4830]: I0311 10:16:39.553777 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znshr_must-gather-wm52g_98e5de6b-39e2-468d-a621-264332585f2f/gather/0.log"
Mar 11 10:16:39 crc kubenswrapper[4830]: I0311 10:16:39.933177 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:16:39 crc kubenswrapper[4830]: E0311 10:16:39.933727 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9"
Mar 11 10:16:47 crc kubenswrapper[4830]: I0311 10:16:47.573038 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-znshr/must-gather-wm52g"]
Mar 11 10:16:47 crc kubenswrapper[4830]: I0311 10:16:47.573767 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-znshr/must-gather-wm52g" podUID="98e5de6b-39e2-468d-a621-264332585f2f" containerName="copy" containerID="cri-o://795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b" gracePeriod=2
Mar 11 10:16:47 crc kubenswrapper[4830]: I0311 10:16:47.585817 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-znshr/must-gather-wm52g"]
Mar 11 10:16:47 crc kubenswrapper[4830]: E0311 10:16:47.693897 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e5de6b_39e2_468d_a621_264332585f2f.slice/crio-conmon-795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b.scope\": RecentStats: unable to find data in memory cache]"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.046643 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znshr_must-gather-wm52g_98e5de6b-39e2-468d-a621-264332585f2f/copy/0.log"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.047868 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znshr/must-gather-wm52g"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.131634 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-znshr_must-gather-wm52g_98e5de6b-39e2-468d-a621-264332585f2f/copy/0.log"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.131952 4830 generic.go:334] "Generic (PLEG): container finished" podID="98e5de6b-39e2-468d-a621-264332585f2f" containerID="795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b" exitCode=143
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.131996 4830 scope.go:117] "RemoveContainer" containerID="795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.132154 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-znshr/must-gather-wm52g"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.156310 4830 scope.go:117] "RemoveContainer" containerID="42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.167242 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98e5de6b-39e2-468d-a621-264332585f2f-must-gather-output\") pod \"98e5de6b-39e2-468d-a621-264332585f2f\" (UID: \"98e5de6b-39e2-468d-a621-264332585f2f\") "
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.167543 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwgml\" (UniqueName: \"kubernetes.io/projected/98e5de6b-39e2-468d-a621-264332585f2f-kube-api-access-kwgml\") pod \"98e5de6b-39e2-468d-a621-264332585f2f\" (UID: \"98e5de6b-39e2-468d-a621-264332585f2f\") "
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.173154 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e5de6b-39e2-468d-a621-264332585f2f-kube-api-access-kwgml" (OuterVolumeSpecName: "kube-api-access-kwgml") pod "98e5de6b-39e2-468d-a621-264332585f2f" (UID: "98e5de6b-39e2-468d-a621-264332585f2f"). InnerVolumeSpecName "kube-api-access-kwgml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.230501 4830 scope.go:117] "RemoveContainer" containerID="795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b"
Mar 11 10:16:48 crc kubenswrapper[4830]: E0311 10:16:48.230973 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b\": container with ID starting with 795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b not found: ID does not exist" containerID="795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.231104 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b"} err="failed to get container status \"795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b\": rpc error: code = NotFound desc = could not find container \"795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b\": container with ID starting with 795dde55becf3a62e1eaf2a732fb2f090863b0b8ce116a5c57001bd43b24a20b not found: ID does not exist"
Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.231190 4830 scope.go:117] "RemoveContainer" containerID="42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c"
Mar 11 10:16:48 crc kubenswrapper[4830]: E0311 10:16:48.231586 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c\":
container with ID starting with 42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c not found: ID does not exist" containerID="42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.231619 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c"} err="failed to get container status \"42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c\": rpc error: code = NotFound desc = could not find container \"42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c\": container with ID starting with 42431aff00e76719f53bda5860f2a1aa491a37c0acaf2047ce8b8cdd1c45203c not found: ID does not exist" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.269761 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwgml\" (UniqueName: \"kubernetes.io/projected/98e5de6b-39e2-468d-a621-264332585f2f-kube-api-access-kwgml\") on node \"crc\" DevicePath \"\"" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.332444 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e5de6b-39e2-468d-a621-264332585f2f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "98e5de6b-39e2-468d-a621-264332585f2f" (UID: "98e5de6b-39e2-468d-a621-264332585f2f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.374621 4830 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98e5de6b-39e2-468d-a621-264332585f2f-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.880274 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5vgfq"] Mar 11 10:16:48 crc kubenswrapper[4830]: E0311 10:16:48.880869 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e5de6b-39e2-468d-a621-264332585f2f" containerName="copy" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.880883 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e5de6b-39e2-468d-a621-264332585f2f" containerName="copy" Mar 11 10:16:48 crc kubenswrapper[4830]: E0311 10:16:48.880909 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e5de6b-39e2-468d-a621-264332585f2f" containerName="gather" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.880916 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e5de6b-39e2-468d-a621-264332585f2f" containerName="gather" Mar 11 10:16:48 crc kubenswrapper[4830]: E0311 10:16:48.880931 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6a38b8-7842-4127-9926-e477ab93d5d6" containerName="oc" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.880937 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6a38b8-7842-4127-9926-e477ab93d5d6" containerName="oc" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.881180 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e5de6b-39e2-468d-a621-264332585f2f" containerName="gather" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.881204 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e5de6b-39e2-468d-a621-264332585f2f" 
containerName="copy" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.881215 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6a38b8-7842-4127-9926-e477ab93d5d6" containerName="oc" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.882643 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.891875 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vgfq"] Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.945119 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e5de6b-39e2-468d-a621-264332585f2f" path="/var/lib/kubelet/pods/98e5de6b-39e2-468d-a621-264332585f2f/volumes" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.987117 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-catalog-content\") pod \"community-operators-5vgfq\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.987401 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-utilities\") pod \"community-operators-5vgfq\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:48 crc kubenswrapper[4830]: I0311 10:16:48.987666 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvpwr\" (UniqueName: \"kubernetes.io/projected/25b46383-2c3c-45f6-b591-0e786de58bc5-kube-api-access-nvpwr\") pod \"community-operators-5vgfq\" (UID: 
\"25b46383-2c3c-45f6-b591-0e786de58bc5\") " pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:49 crc kubenswrapper[4830]: I0311 10:16:49.089917 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvpwr\" (UniqueName: \"kubernetes.io/projected/25b46383-2c3c-45f6-b591-0e786de58bc5-kube-api-access-nvpwr\") pod \"community-operators-5vgfq\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:49 crc kubenswrapper[4830]: I0311 10:16:49.090268 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-catalog-content\") pod \"community-operators-5vgfq\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:49 crc kubenswrapper[4830]: I0311 10:16:49.090446 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-utilities\") pod \"community-operators-5vgfq\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:49 crc kubenswrapper[4830]: I0311 10:16:49.090831 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-catalog-content\") pod \"community-operators-5vgfq\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:49 crc kubenswrapper[4830]: I0311 10:16:49.090868 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-utilities\") pod \"community-operators-5vgfq\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") 
" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:49 crc kubenswrapper[4830]: I0311 10:16:49.109642 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvpwr\" (UniqueName: \"kubernetes.io/projected/25b46383-2c3c-45f6-b591-0e786de58bc5-kube-api-access-nvpwr\") pod \"community-operators-5vgfq\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:49 crc kubenswrapper[4830]: I0311 10:16:49.203240 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:49 crc kubenswrapper[4830]: I0311 10:16:49.725566 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vgfq"] Mar 11 10:16:50 crc kubenswrapper[4830]: I0311 10:16:50.155822 4830 generic.go:334] "Generic (PLEG): container finished" podID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerID="fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee" exitCode=0 Mar 11 10:16:50 crc kubenswrapper[4830]: I0311 10:16:50.155884 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vgfq" event={"ID":"25b46383-2c3c-45f6-b591-0e786de58bc5","Type":"ContainerDied","Data":"fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee"} Mar 11 10:16:50 crc kubenswrapper[4830]: I0311 10:16:50.156270 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vgfq" event={"ID":"25b46383-2c3c-45f6-b591-0e786de58bc5","Type":"ContainerStarted","Data":"1c21abc6d62dd39882ebe7c17b488ab6b90155e9964c28adbd14674458dd0b4e"} Mar 11 10:16:51 crc kubenswrapper[4830]: I0311 10:16:51.165827 4830 generic.go:334] "Generic (PLEG): container finished" podID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerID="0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e" exitCode=0 Mar 11 10:16:51 crc 
kubenswrapper[4830]: I0311 10:16:51.165884 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vgfq" event={"ID":"25b46383-2c3c-45f6-b591-0e786de58bc5","Type":"ContainerDied","Data":"0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e"} Mar 11 10:16:52 crc kubenswrapper[4830]: I0311 10:16:52.185758 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vgfq" event={"ID":"25b46383-2c3c-45f6-b591-0e786de58bc5","Type":"ContainerStarted","Data":"c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8"} Mar 11 10:16:52 crc kubenswrapper[4830]: I0311 10:16:52.222531 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5vgfq" podStartSLOduration=2.714106808 podStartE2EDuration="4.222490731s" podCreationTimestamp="2026-03-11 10:16:48 +0000 UTC" firstStartedPulling="2026-03-11 10:16:50.15851095 +0000 UTC m=+3777.939661639" lastFinishedPulling="2026-03-11 10:16:51.666894873 +0000 UTC m=+3779.448045562" observedRunningTime="2026-03-11 10:16:52.20739028 +0000 UTC m=+3779.988540999" watchObservedRunningTime="2026-03-11 10:16:52.222490731 +0000 UTC m=+3780.003641420" Mar 11 10:16:52 crc kubenswrapper[4830]: I0311 10:16:52.940409 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:16:52 crc kubenswrapper[4830]: E0311 10:16:52.940912 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:16:59 crc kubenswrapper[4830]: I0311 10:16:59.204106 4830 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:59 crc kubenswrapper[4830]: I0311 10:16:59.204853 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:59 crc kubenswrapper[4830]: I0311 10:16:59.270207 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:59 crc kubenswrapper[4830]: I0311 10:16:59.328212 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:16:59 crc kubenswrapper[4830]: I0311 10:16:59.507134 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vgfq"] Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.374800 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5vgfq" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerName="registry-server" containerID="cri-o://c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8" gracePeriod=2 Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.800438 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.838315 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-utilities\") pod \"25b46383-2c3c-45f6-b591-0e786de58bc5\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.838389 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-catalog-content\") pod \"25b46383-2c3c-45f6-b591-0e786de58bc5\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.838456 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvpwr\" (UniqueName: \"kubernetes.io/projected/25b46383-2c3c-45f6-b591-0e786de58bc5-kube-api-access-nvpwr\") pod \"25b46383-2c3c-45f6-b591-0e786de58bc5\" (UID: \"25b46383-2c3c-45f6-b591-0e786de58bc5\") " Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.840533 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-utilities" (OuterVolumeSpecName: "utilities") pod "25b46383-2c3c-45f6-b591-0e786de58bc5" (UID: "25b46383-2c3c-45f6-b591-0e786de58bc5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.850256 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b46383-2c3c-45f6-b591-0e786de58bc5-kube-api-access-nvpwr" (OuterVolumeSpecName: "kube-api-access-nvpwr") pod "25b46383-2c3c-45f6-b591-0e786de58bc5" (UID: "25b46383-2c3c-45f6-b591-0e786de58bc5"). InnerVolumeSpecName "kube-api-access-nvpwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.899795 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25b46383-2c3c-45f6-b591-0e786de58bc5" (UID: "25b46383-2c3c-45f6-b591-0e786de58bc5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.940705 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.940752 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25b46383-2c3c-45f6-b591-0e786de58bc5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:17:02 crc kubenswrapper[4830]: I0311 10:17:02.940764 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvpwr\" (UniqueName: \"kubernetes.io/projected/25b46383-2c3c-45f6-b591-0e786de58bc5-kube-api-access-nvpwr\") on node \"crc\" DevicePath \"\"" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.386403 4830 generic.go:334] "Generic (PLEG): container finished" podID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerID="c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8" exitCode=0 Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.386474 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vgfq" event={"ID":"25b46383-2c3c-45f6-b591-0e786de58bc5","Type":"ContainerDied","Data":"c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8"} Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.386496 4830 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vgfq" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.386776 4830 scope.go:117] "RemoveContainer" containerID="c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.386761 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vgfq" event={"ID":"25b46383-2c3c-45f6-b591-0e786de58bc5","Type":"ContainerDied","Data":"1c21abc6d62dd39882ebe7c17b488ab6b90155e9964c28adbd14674458dd0b4e"} Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.410222 4830 scope.go:117] "RemoveContainer" containerID="0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.420371 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vgfq"] Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.428594 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5vgfq"] Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.439432 4830 scope.go:117] "RemoveContainer" containerID="fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.539118 4830 scope.go:117] "RemoveContainer" containerID="c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8" Mar 11 10:17:03 crc kubenswrapper[4830]: E0311 10:17:03.539663 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8\": container with ID starting with c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8 not found: ID does not exist" containerID="c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.539710 
4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8"} err="failed to get container status \"c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8\": rpc error: code = NotFound desc = could not find container \"c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8\": container with ID starting with c822ae8f3a8c30b107608a157a865482162bb81a40c8a1797ec82db3688d00b8 not found: ID does not exist" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.539736 4830 scope.go:117] "RemoveContainer" containerID="0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e" Mar 11 10:17:03 crc kubenswrapper[4830]: E0311 10:17:03.540161 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e\": container with ID starting with 0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e not found: ID does not exist" containerID="0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.540184 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e"} err="failed to get container status \"0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e\": rpc error: code = NotFound desc = could not find container \"0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e\": container with ID starting with 0bc325b1205141b70beb1175bd9ecbe79d89a39bed0ed754b2303de33f52ac5e not found: ID does not exist" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.540199 4830 scope.go:117] "RemoveContainer" containerID="fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee" Mar 11 10:17:03 crc kubenswrapper[4830]: E0311 
10:17:03.540505 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee\": container with ID starting with fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee not found: ID does not exist" containerID="fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee" Mar 11 10:17:03 crc kubenswrapper[4830]: I0311 10:17:03.540547 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee"} err="failed to get container status \"fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee\": rpc error: code = NotFound desc = could not find container \"fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee\": container with ID starting with fe38089c84b9ebb9b58be8b0093a8f24b16da38c7f55c3848cfe499485f737ee not found: ID does not exist" Mar 11 10:17:04 crc kubenswrapper[4830]: I0311 10:17:04.946908 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" path="/var/lib/kubelet/pods/25b46383-2c3c-45f6-b591-0e786de58bc5/volumes" Mar 11 10:17:07 crc kubenswrapper[4830]: I0311 10:17:07.933307 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:17:07 crc kubenswrapper[4830]: E0311 10:17:07.934927 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:17:14 crc kubenswrapper[4830]: I0311 10:17:13.999652 
4830 scope.go:117] "RemoveContainer" containerID="48f2aa0f9059bebc618a1e98daa13ab01c37b50131e3b680e205ba672593047c" Mar 11 10:17:14 crc kubenswrapper[4830]: I0311 10:17:14.024709 4830 scope.go:117] "RemoveContainer" containerID="47cd0c1bbc45df87ed660e61b86ff49422709e04aca310244a00abf0e4b96ae7" Mar 11 10:17:19 crc kubenswrapper[4830]: I0311 10:17:19.932510 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:17:19 crc kubenswrapper[4830]: E0311 10:17:19.933431 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:17:32 crc kubenswrapper[4830]: I0311 10:17:32.938704 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:17:32 crc kubenswrapper[4830]: E0311 10:17:32.939463 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:17:46 crc kubenswrapper[4830]: I0311 10:17:46.932943 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:17:46 crc kubenswrapper[4830]: E0311 10:17:46.933828 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.149482 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553738-5m2l9"] Mar 11 10:18:00 crc kubenswrapper[4830]: E0311 10:18:00.150704 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerName="extract-utilities" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.150724 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerName="extract-utilities" Mar 11 10:18:00 crc kubenswrapper[4830]: E0311 10:18:00.150744 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerName="extract-content" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.150752 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerName="extract-content" Mar 11 10:18:00 crc kubenswrapper[4830]: E0311 10:18:00.150784 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerName="registry-server" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.150793 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerName="registry-server" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.151039 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b46383-2c3c-45f6-b591-0e786de58bc5" containerName="registry-server" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.151893 4830 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553738-5m2l9" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.157102 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.157336 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.160279 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553738-5m2l9"] Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.162954 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.165852 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5l9g\" (UniqueName: \"kubernetes.io/projected/fd741083-a35c-4930-a1fa-c16df13d01af-kube-api-access-n5l9g\") pod \"auto-csr-approver-29553738-5m2l9\" (UID: \"fd741083-a35c-4930-a1fa-c16df13d01af\") " pod="openshift-infra/auto-csr-approver-29553738-5m2l9" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.268282 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5l9g\" (UniqueName: \"kubernetes.io/projected/fd741083-a35c-4930-a1fa-c16df13d01af-kube-api-access-n5l9g\") pod \"auto-csr-approver-29553738-5m2l9\" (UID: \"fd741083-a35c-4930-a1fa-c16df13d01af\") " pod="openshift-infra/auto-csr-approver-29553738-5m2l9" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.286368 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5l9g\" (UniqueName: \"kubernetes.io/projected/fd741083-a35c-4930-a1fa-c16df13d01af-kube-api-access-n5l9g\") pod \"auto-csr-approver-29553738-5m2l9\" (UID: 
\"fd741083-a35c-4930-a1fa-c16df13d01af\") " pod="openshift-infra/auto-csr-approver-29553738-5m2l9" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.475861 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553738-5m2l9" Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.908166 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553738-5m2l9"] Mar 11 10:18:00 crc kubenswrapper[4830]: I0311 10:18:00.933294 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:18:00 crc kubenswrapper[4830]: E0311 10:18:00.933582 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:18:01 crc kubenswrapper[4830]: I0311 10:18:01.891618 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553738-5m2l9" event={"ID":"fd741083-a35c-4930-a1fa-c16df13d01af","Type":"ContainerStarted","Data":"00200bc0d380ede4faee1dc2ea6ed668598ad2d46e7fa48a56a15795d0484750"} Mar 11 10:18:02 crc kubenswrapper[4830]: I0311 10:18:02.901338 4830 generic.go:334] "Generic (PLEG): container finished" podID="fd741083-a35c-4930-a1fa-c16df13d01af" containerID="b1e21e68b3d6621568ba6a422cd0477f13cfe36c88a3cf6439c31ac61f88a7c7" exitCode=0 Mar 11 10:18:02 crc kubenswrapper[4830]: I0311 10:18:02.901476 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553738-5m2l9" 
event={"ID":"fd741083-a35c-4930-a1fa-c16df13d01af","Type":"ContainerDied","Data":"b1e21e68b3d6621568ba6a422cd0477f13cfe36c88a3cf6439c31ac61f88a7c7"} Mar 11 10:18:04 crc kubenswrapper[4830]: I0311 10:18:04.231184 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553738-5m2l9" Mar 11 10:18:04 crc kubenswrapper[4830]: I0311 10:18:04.249003 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5l9g\" (UniqueName: \"kubernetes.io/projected/fd741083-a35c-4930-a1fa-c16df13d01af-kube-api-access-n5l9g\") pod \"fd741083-a35c-4930-a1fa-c16df13d01af\" (UID: \"fd741083-a35c-4930-a1fa-c16df13d01af\") " Mar 11 10:18:04 crc kubenswrapper[4830]: I0311 10:18:04.256708 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd741083-a35c-4930-a1fa-c16df13d01af-kube-api-access-n5l9g" (OuterVolumeSpecName: "kube-api-access-n5l9g") pod "fd741083-a35c-4930-a1fa-c16df13d01af" (UID: "fd741083-a35c-4930-a1fa-c16df13d01af"). InnerVolumeSpecName "kube-api-access-n5l9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:18:04 crc kubenswrapper[4830]: I0311 10:18:04.353902 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5l9g\" (UniqueName: \"kubernetes.io/projected/fd741083-a35c-4930-a1fa-c16df13d01af-kube-api-access-n5l9g\") on node \"crc\" DevicePath \"\"" Mar 11 10:18:04 crc kubenswrapper[4830]: I0311 10:18:04.921219 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553738-5m2l9" event={"ID":"fd741083-a35c-4930-a1fa-c16df13d01af","Type":"ContainerDied","Data":"00200bc0d380ede4faee1dc2ea6ed668598ad2d46e7fa48a56a15795d0484750"} Mar 11 10:18:04 crc kubenswrapper[4830]: I0311 10:18:04.921262 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00200bc0d380ede4faee1dc2ea6ed668598ad2d46e7fa48a56a15795d0484750" Mar 11 10:18:04 crc kubenswrapper[4830]: I0311 10:18:04.921261 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553738-5m2l9" Mar 11 10:18:05 crc kubenswrapper[4830]: I0311 10:18:05.295348 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553732-fdthc"] Mar 11 10:18:05 crc kubenswrapper[4830]: I0311 10:18:05.303993 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553732-fdthc"] Mar 11 10:18:06 crc kubenswrapper[4830]: I0311 10:18:06.941995 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ed5e7e-173b-4140-ba24-3cdf1796f2ae" path="/var/lib/kubelet/pods/70ed5e7e-173b-4140-ba24-3cdf1796f2ae/volumes" Mar 11 10:18:14 crc kubenswrapper[4830]: I0311 10:18:14.140893 4830 scope.go:117] "RemoveContainer" containerID="685365cb764aebcbf9e30a927f1f7276048d557f0591d1293091263a6aa32896" Mar 11 10:18:14 crc kubenswrapper[4830]: I0311 10:18:14.186878 4830 scope.go:117] "RemoveContainer" 
containerID="2518246ca096fc674af64bd2d0fcae4bed583716b5aef898df4d00dd056c1610" Mar 11 10:18:14 crc kubenswrapper[4830]: I0311 10:18:14.933373 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:18:14 crc kubenswrapper[4830]: E0311 10:18:14.933945 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:18:26 crc kubenswrapper[4830]: I0311 10:18:26.932438 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:18:26 crc kubenswrapper[4830]: E0311 10:18:26.933402 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:18:39 crc kubenswrapper[4830]: I0311 10:18:39.932845 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:18:39 crc kubenswrapper[4830]: E0311 10:18:39.933637 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:18:51 crc kubenswrapper[4830]: I0311 10:18:51.932660 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:18:51 crc kubenswrapper[4830]: E0311 10:18:51.933489 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:19:02 crc kubenswrapper[4830]: I0311 10:19:02.938283 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:19:02 crc kubenswrapper[4830]: E0311 10:19:02.938990 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:19:13 crc kubenswrapper[4830]: I0311 10:19:13.933534 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:19:13 crc kubenswrapper[4830]: E0311 10:19:13.934390 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:19:28 crc kubenswrapper[4830]: I0311 10:19:28.936373 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:19:28 crc kubenswrapper[4830]: E0311 10:19:28.937211 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:19:40 crc kubenswrapper[4830]: I0311 10:19:40.933094 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:19:40 crc kubenswrapper[4830]: E0311 10:19:40.933941 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.127281 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wlpqd/must-gather-8l2hl"] Mar 11 10:19:46 crc kubenswrapper[4830]: E0311 10:19:46.128197 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd741083-a35c-4930-a1fa-c16df13d01af" containerName="oc" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.128212 4830 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fd741083-a35c-4930-a1fa-c16df13d01af" containerName="oc" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.128417 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd741083-a35c-4930-a1fa-c16df13d01af" containerName="oc" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.139808 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wlpqd/must-gather-8l2hl"] Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.140035 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.145371 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wlpqd"/"default-dockercfg-2m9l5" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.145412 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wlpqd"/"kube-root-ca.crt" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.145597 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wlpqd"/"openshift-service-ca.crt" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.252191 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/50a6aad1-7754-45c2-8c6e-ce71e051cd18-must-gather-output\") pod \"must-gather-8l2hl\" (UID: \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\") " pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.252314 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg9xs\" (UniqueName: \"kubernetes.io/projected/50a6aad1-7754-45c2-8c6e-ce71e051cd18-kube-api-access-cg9xs\") pod \"must-gather-8l2hl\" (UID: \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\") " 
pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.354952 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg9xs\" (UniqueName: \"kubernetes.io/projected/50a6aad1-7754-45c2-8c6e-ce71e051cd18-kube-api-access-cg9xs\") pod \"must-gather-8l2hl\" (UID: \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\") " pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.355676 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/50a6aad1-7754-45c2-8c6e-ce71e051cd18-must-gather-output\") pod \"must-gather-8l2hl\" (UID: \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\") " pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.356186 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/50a6aad1-7754-45c2-8c6e-ce71e051cd18-must-gather-output\") pod \"must-gather-8l2hl\" (UID: \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\") " pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.393541 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg9xs\" (UniqueName: \"kubernetes.io/projected/50a6aad1-7754-45c2-8c6e-ce71e051cd18-kube-api-access-cg9xs\") pod \"must-gather-8l2hl\" (UID: \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\") " pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.462249 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:19:46 crc kubenswrapper[4830]: I0311 10:19:46.923622 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wlpqd/must-gather-8l2hl"] Mar 11 10:19:47 crc kubenswrapper[4830]: I0311 10:19:47.834090 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" event={"ID":"50a6aad1-7754-45c2-8c6e-ce71e051cd18","Type":"ContainerStarted","Data":"8c1b68978767b2d02795c43c983500b5fe2dfa17b8ca28047a25f63e15c1472a"} Mar 11 10:19:47 crc kubenswrapper[4830]: I0311 10:19:47.834361 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" event={"ID":"50a6aad1-7754-45c2-8c6e-ce71e051cd18","Type":"ContainerStarted","Data":"8de9cf0cc7d19423c3cb92479f282ea845709ffe47b94d4fc06a486d914c283f"} Mar 11 10:19:47 crc kubenswrapper[4830]: I0311 10:19:47.834377 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" event={"ID":"50a6aad1-7754-45c2-8c6e-ce71e051cd18","Type":"ContainerStarted","Data":"ec933199560a861914f7e825c0e95958b2df2e88752ab2aad24d44bfbe04af98"} Mar 11 10:19:47 crc kubenswrapper[4830]: I0311 10:19:47.854642 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" podStartSLOduration=1.8546242309999998 podStartE2EDuration="1.854624231s" podCreationTimestamp="2026-03-11 10:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 10:19:47.847679742 +0000 UTC m=+3955.628830461" watchObservedRunningTime="2026-03-11 10:19:47.854624231 +0000 UTC m=+3955.635774910" Mar 11 10:19:50 crc kubenswrapper[4830]: I0311 10:19:50.902970 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-r2x4x"] Mar 11 10:19:50 crc 
kubenswrapper[4830]: I0311 10:19:50.905111 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.041827 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd95f2f-abf1-4eb0-af3c-179906c463ab-host\") pod \"crc-debug-r2x4x\" (UID: \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\") " pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.042212 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcvrb\" (UniqueName: \"kubernetes.io/projected/3cd95f2f-abf1-4eb0-af3c-179906c463ab-kube-api-access-bcvrb\") pod \"crc-debug-r2x4x\" (UID: \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\") " pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.144608 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd95f2f-abf1-4eb0-af3c-179906c463ab-host\") pod \"crc-debug-r2x4x\" (UID: \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\") " pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.144667 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcvrb\" (UniqueName: \"kubernetes.io/projected/3cd95f2f-abf1-4eb0-af3c-179906c463ab-kube-api-access-bcvrb\") pod \"crc-debug-r2x4x\" (UID: \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\") " pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.144775 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd95f2f-abf1-4eb0-af3c-179906c463ab-host\") pod \"crc-debug-r2x4x\" (UID: 
\"3cd95f2f-abf1-4eb0-af3c-179906c463ab\") " pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.165965 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcvrb\" (UniqueName: \"kubernetes.io/projected/3cd95f2f-abf1-4eb0-af3c-179906c463ab-kube-api-access-bcvrb\") pod \"crc-debug-r2x4x\" (UID: \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\") " pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.226756 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:19:51 crc kubenswrapper[4830]: W0311 10:19:51.284553 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cd95f2f_abf1_4eb0_af3c_179906c463ab.slice/crio-dd288f84e95046e7332536ebdbe5f9bd0aed3b683355ba87750439591ee7fef3 WatchSource:0}: Error finding container dd288f84e95046e7332536ebdbe5f9bd0aed3b683355ba87750439591ee7fef3: Status 404 returned error can't find the container with id dd288f84e95046e7332536ebdbe5f9bd0aed3b683355ba87750439591ee7fef3 Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.866267 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" event={"ID":"3cd95f2f-abf1-4eb0-af3c-179906c463ab","Type":"ContainerStarted","Data":"f47c113cda4eb07caaf06fcc9df3a19fa976a4d71b3675868dda1c9fc4a34a12"} Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.866828 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" event={"ID":"3cd95f2f-abf1-4eb0-af3c-179906c463ab","Type":"ContainerStarted","Data":"dd288f84e95046e7332536ebdbe5f9bd0aed3b683355ba87750439591ee7fef3"} Mar 11 10:19:51 crc kubenswrapper[4830]: I0311 10:19:51.891726 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" podStartSLOduration=1.891703284 podStartE2EDuration="1.891703284s" podCreationTimestamp="2026-03-11 10:19:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-11 10:19:51.883170723 +0000 UTC m=+3959.664321432" watchObservedRunningTime="2026-03-11 10:19:51.891703284 +0000 UTC m=+3959.672853993" Mar 11 10:19:53 crc kubenswrapper[4830]: I0311 10:19:53.933209 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2" Mar 11 10:19:54 crc kubenswrapper[4830]: I0311 10:19:54.902963 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"faa8f29715aee87d768668d7a8b45badc0d9d0b5b2bf527a7d66e539bd3a1912"} Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.142757 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553740-zlmdr"] Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.144905 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553740-zlmdr" Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.147214 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.147437 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.151814 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553740-zlmdr"] Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.152123 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.217289 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/ec1dd9ac-f029-479e-9e4f-f81d1810d084-kube-api-access-dx2c7\") pod \"auto-csr-approver-29553740-zlmdr\" (UID: \"ec1dd9ac-f029-479e-9e4f-f81d1810d084\") " pod="openshift-infra/auto-csr-approver-29553740-zlmdr" Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.319232 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/ec1dd9ac-f029-479e-9e4f-f81d1810d084-kube-api-access-dx2c7\") pod \"auto-csr-approver-29553740-zlmdr\" (UID: \"ec1dd9ac-f029-479e-9e4f-f81d1810d084\") " pod="openshift-infra/auto-csr-approver-29553740-zlmdr" Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.341764 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/ec1dd9ac-f029-479e-9e4f-f81d1810d084-kube-api-access-dx2c7\") pod \"auto-csr-approver-29553740-zlmdr\" (UID: \"ec1dd9ac-f029-479e-9e4f-f81d1810d084\") " 
pod="openshift-infra/auto-csr-approver-29553740-zlmdr" Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.463030 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553740-zlmdr" Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.988464 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553740-zlmdr"] Mar 11 10:20:00 crc kubenswrapper[4830]: I0311 10:20:00.989370 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 10:20:01 crc kubenswrapper[4830]: I0311 10:20:01.960967 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553740-zlmdr" event={"ID":"ec1dd9ac-f029-479e-9e4f-f81d1810d084","Type":"ContainerStarted","Data":"f9cb92505f92929e646c839408879b8e601ff6a67749eb67d2303ba5f16a8122"} Mar 11 10:20:02 crc kubenswrapper[4830]: I0311 10:20:02.971484 4830 generic.go:334] "Generic (PLEG): container finished" podID="ec1dd9ac-f029-479e-9e4f-f81d1810d084" containerID="ce4edcc2b2ad947a2e21dc3e1aa6d69c4a16c087b530169b3716edffd9af53c0" exitCode=0 Mar 11 10:20:02 crc kubenswrapper[4830]: I0311 10:20:02.971590 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553740-zlmdr" event={"ID":"ec1dd9ac-f029-479e-9e4f-f81d1810d084","Type":"ContainerDied","Data":"ce4edcc2b2ad947a2e21dc3e1aa6d69c4a16c087b530169b3716edffd9af53c0"} Mar 11 10:20:04 crc kubenswrapper[4830]: I0311 10:20:04.405052 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553740-zlmdr" Mar 11 10:20:04 crc kubenswrapper[4830]: I0311 10:20:04.507699 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/ec1dd9ac-f029-479e-9e4f-f81d1810d084-kube-api-access-dx2c7\") pod \"ec1dd9ac-f029-479e-9e4f-f81d1810d084\" (UID: \"ec1dd9ac-f029-479e-9e4f-f81d1810d084\") " Mar 11 10:20:04 crc kubenswrapper[4830]: I0311 10:20:04.531439 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1dd9ac-f029-479e-9e4f-f81d1810d084-kube-api-access-dx2c7" (OuterVolumeSpecName: "kube-api-access-dx2c7") pod "ec1dd9ac-f029-479e-9e4f-f81d1810d084" (UID: "ec1dd9ac-f029-479e-9e4f-f81d1810d084"). InnerVolumeSpecName "kube-api-access-dx2c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:20:04 crc kubenswrapper[4830]: I0311 10:20:04.610285 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx2c7\" (UniqueName: \"kubernetes.io/projected/ec1dd9ac-f029-479e-9e4f-f81d1810d084-kube-api-access-dx2c7\") on node \"crc\" DevicePath \"\"" Mar 11 10:20:04 crc kubenswrapper[4830]: I0311 10:20:04.990570 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553740-zlmdr" event={"ID":"ec1dd9ac-f029-479e-9e4f-f81d1810d084","Type":"ContainerDied","Data":"f9cb92505f92929e646c839408879b8e601ff6a67749eb67d2303ba5f16a8122"} Mar 11 10:20:04 crc kubenswrapper[4830]: I0311 10:20:04.990647 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9cb92505f92929e646c839408879b8e601ff6a67749eb67d2303ba5f16a8122" Mar 11 10:20:04 crc kubenswrapper[4830]: I0311 10:20:04.990617 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553740-zlmdr" Mar 11 10:20:05 crc kubenswrapper[4830]: I0311 10:20:05.541467 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553734-qdrg7"] Mar 11 10:20:05 crc kubenswrapper[4830]: I0311 10:20:05.560648 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553734-qdrg7"] Mar 11 10:20:06 crc kubenswrapper[4830]: I0311 10:20:06.943938 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d6322f-95db-4f57-bbc6-1dfd2a023536" path="/var/lib/kubelet/pods/c2d6322f-95db-4f57-bbc6-1dfd2a023536/volumes" Mar 11 10:20:14 crc kubenswrapper[4830]: I0311 10:20:14.275924 4830 scope.go:117] "RemoveContainer" containerID="8b80bc7215e2511c0f59c2b82bdbe8dbacf517badbe2330f90b029366c7c373a" Mar 11 10:20:27 crc kubenswrapper[4830]: I0311 10:20:27.213738 4830 generic.go:334] "Generic (PLEG): container finished" podID="3cd95f2f-abf1-4eb0-af3c-179906c463ab" containerID="f47c113cda4eb07caaf06fcc9df3a19fa976a4d71b3675868dda1c9fc4a34a12" exitCode=0 Mar 11 10:20:27 crc kubenswrapper[4830]: I0311 10:20:27.214067 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" event={"ID":"3cd95f2f-abf1-4eb0-af3c-179906c463ab","Type":"ContainerDied","Data":"f47c113cda4eb07caaf06fcc9df3a19fa976a4d71b3675868dda1c9fc4a34a12"} Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.335413 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.370199 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-r2x4x"] Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.378183 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-r2x4x"] Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.458632 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd95f2f-abf1-4eb0-af3c-179906c463ab-host\") pod \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\" (UID: \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\") " Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.458698 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cd95f2f-abf1-4eb0-af3c-179906c463ab-host" (OuterVolumeSpecName: "host") pod "3cd95f2f-abf1-4eb0-af3c-179906c463ab" (UID: "3cd95f2f-abf1-4eb0-af3c-179906c463ab"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.459496 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcvrb\" (UniqueName: \"kubernetes.io/projected/3cd95f2f-abf1-4eb0-af3c-179906c463ab-kube-api-access-bcvrb\") pod \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\" (UID: \"3cd95f2f-abf1-4eb0-af3c-179906c463ab\") " Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.460114 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3cd95f2f-abf1-4eb0-af3c-179906c463ab-host\") on node \"crc\" DevicePath \"\"" Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.466681 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd95f2f-abf1-4eb0-af3c-179906c463ab-kube-api-access-bcvrb" (OuterVolumeSpecName: "kube-api-access-bcvrb") pod "3cd95f2f-abf1-4eb0-af3c-179906c463ab" (UID: "3cd95f2f-abf1-4eb0-af3c-179906c463ab"). InnerVolumeSpecName "kube-api-access-bcvrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.562397 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcvrb\" (UniqueName: \"kubernetes.io/projected/3cd95f2f-abf1-4eb0-af3c-179906c463ab-kube-api-access-bcvrb\") on node \"crc\" DevicePath \"\"" Mar 11 10:20:28 crc kubenswrapper[4830]: I0311 10:20:28.943685 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd95f2f-abf1-4eb0-af3c-179906c463ab" path="/var/lib/kubelet/pods/3cd95f2f-abf1-4eb0-af3c-179906c463ab/volumes" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.232365 4830 scope.go:117] "RemoveContainer" containerID="f47c113cda4eb07caaf06fcc9df3a19fa976a4d71b3675868dda1c9fc4a34a12" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.232469 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-r2x4x" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.546921 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-8rhbl"] Mar 11 10:20:29 crc kubenswrapper[4830]: E0311 10:20:29.547687 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd95f2f-abf1-4eb0-af3c-179906c463ab" containerName="container-00" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.547704 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd95f2f-abf1-4eb0-af3c-179906c463ab" containerName="container-00" Mar 11 10:20:29 crc kubenswrapper[4830]: E0311 10:20:29.547724 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1dd9ac-f029-479e-9e4f-f81d1810d084" containerName="oc" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.547732 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1dd9ac-f029-479e-9e4f-f81d1810d084" containerName="oc" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.547958 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1dd9ac-f029-479e-9e4f-f81d1810d084" containerName="oc" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.547981 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd95f2f-abf1-4eb0-af3c-179906c463ab" containerName="container-00" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.548777 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.583954 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-host\") pod \"crc-debug-8rhbl\" (UID: \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\") " pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.584248 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt55l\" (UniqueName: \"kubernetes.io/projected/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-kube-api-access-zt55l\") pod \"crc-debug-8rhbl\" (UID: \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\") " pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.686437 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt55l\" (UniqueName: \"kubernetes.io/projected/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-kube-api-access-zt55l\") pod \"crc-debug-8rhbl\" (UID: \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\") " pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.686596 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-host\") pod \"crc-debug-8rhbl\" (UID: \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\") " pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.686703 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-host\") pod \"crc-debug-8rhbl\" (UID: \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\") " pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:29 crc 
kubenswrapper[4830]: I0311 10:20:29.702955 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt55l\" (UniqueName: \"kubernetes.io/projected/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-kube-api-access-zt55l\") pod \"crc-debug-8rhbl\" (UID: \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\") " pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:29 crc kubenswrapper[4830]: I0311 10:20:29.871416 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:29 crc kubenswrapper[4830]: W0311 10:20:29.896957 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda37a3d21_340c_48e8_bc89_f0fb0edd9c93.slice/crio-3416319cc193c78b37eb2916ba6c7e53eff6ca2007b17ea6702bbe9002514540 WatchSource:0}: Error finding container 3416319cc193c78b37eb2916ba6c7e53eff6ca2007b17ea6702bbe9002514540: Status 404 returned error can't find the container with id 3416319cc193c78b37eb2916ba6c7e53eff6ca2007b17ea6702bbe9002514540 Mar 11 10:20:30 crc kubenswrapper[4830]: I0311 10:20:30.245133 4830 generic.go:334] "Generic (PLEG): container finished" podID="a37a3d21-340c-48e8-bc89-f0fb0edd9c93" containerID="c4d79bc5c2c76bbdbc67b42204dacc09c50623ba08da18375dc357224b349493" exitCode=0 Mar 11 10:20:30 crc kubenswrapper[4830]: I0311 10:20:30.245197 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" event={"ID":"a37a3d21-340c-48e8-bc89-f0fb0edd9c93","Type":"ContainerDied","Data":"c4d79bc5c2c76bbdbc67b42204dacc09c50623ba08da18375dc357224b349493"} Mar 11 10:20:30 crc kubenswrapper[4830]: I0311 10:20:30.245422 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" event={"ID":"a37a3d21-340c-48e8-bc89-f0fb0edd9c93","Type":"ContainerStarted","Data":"3416319cc193c78b37eb2916ba6c7e53eff6ca2007b17ea6702bbe9002514540"} Mar 11 
10:20:30 crc kubenswrapper[4830]: I0311 10:20:30.644630 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-8rhbl"] Mar 11 10:20:30 crc kubenswrapper[4830]: I0311 10:20:30.652766 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-8rhbl"] Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.373537 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.427328 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-host\") pod \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\" (UID: \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\") " Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.427423 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt55l\" (UniqueName: \"kubernetes.io/projected/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-kube-api-access-zt55l\") pod \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\" (UID: \"a37a3d21-340c-48e8-bc89-f0fb0edd9c93\") " Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.427494 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-host" (OuterVolumeSpecName: "host") pod "a37a3d21-340c-48e8-bc89-f0fb0edd9c93" (UID: "a37a3d21-340c-48e8-bc89-f0fb0edd9c93"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.428013 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-host\") on node \"crc\" DevicePath \"\"" Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.433206 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-kube-api-access-zt55l" (OuterVolumeSpecName: "kube-api-access-zt55l") pod "a37a3d21-340c-48e8-bc89-f0fb0edd9c93" (UID: "a37a3d21-340c-48e8-bc89-f0fb0edd9c93"). InnerVolumeSpecName "kube-api-access-zt55l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.529812 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt55l\" (UniqueName: \"kubernetes.io/projected/a37a3d21-340c-48e8-bc89-f0fb0edd9c93-kube-api-access-zt55l\") on node \"crc\" DevicePath \"\"" Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.910071 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-q5cbz"] Mar 11 10:20:31 crc kubenswrapper[4830]: E0311 10:20:31.910736 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37a3d21-340c-48e8-bc89-f0fb0edd9c93" containerName="container-00" Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.910752 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37a3d21-340c-48e8-bc89-f0fb0edd9c93" containerName="container-00" Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.910932 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37a3d21-340c-48e8-bc89-f0fb0edd9c93" containerName="container-00" Mar 11 10:20:31 crc kubenswrapper[4830]: I0311 10:20:31.911569 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.039725 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a9deb9d-d5db-411e-9e35-fa2abc938777-host\") pod \"crc-debug-q5cbz\" (UID: \"4a9deb9d-d5db-411e-9e35-fa2abc938777\") " pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.039818 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkww\" (UniqueName: \"kubernetes.io/projected/4a9deb9d-d5db-411e-9e35-fa2abc938777-kube-api-access-wdkww\") pod \"crc-debug-q5cbz\" (UID: \"4a9deb9d-d5db-411e-9e35-fa2abc938777\") " pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.141725 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a9deb9d-d5db-411e-9e35-fa2abc938777-host\") pod \"crc-debug-q5cbz\" (UID: \"4a9deb9d-d5db-411e-9e35-fa2abc938777\") " pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.141800 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkww\" (UniqueName: \"kubernetes.io/projected/4a9deb9d-d5db-411e-9e35-fa2abc938777-kube-api-access-wdkww\") pod \"crc-debug-q5cbz\" (UID: \"4a9deb9d-d5db-411e-9e35-fa2abc938777\") " pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.141888 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a9deb9d-d5db-411e-9e35-fa2abc938777-host\") pod \"crc-debug-q5cbz\" (UID: \"4a9deb9d-d5db-411e-9e35-fa2abc938777\") " pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:32 crc 
kubenswrapper[4830]: I0311 10:20:32.161785 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkww\" (UniqueName: \"kubernetes.io/projected/4a9deb9d-d5db-411e-9e35-fa2abc938777-kube-api-access-wdkww\") pod \"crc-debug-q5cbz\" (UID: \"4a9deb9d-d5db-411e-9e35-fa2abc938777\") " pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.229256 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.267554 4830 scope.go:117] "RemoveContainer" containerID="c4d79bc5c2c76bbdbc67b42204dacc09c50623ba08da18375dc357224b349493" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.267685 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-8rhbl" Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.270576 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" event={"ID":"4a9deb9d-d5db-411e-9e35-fa2abc938777","Type":"ContainerStarted","Data":"55fd4f448610186dbe6dd4a02fae74e072de9dc16048a941db7f104a2e4aef1a"} Mar 11 10:20:32 crc kubenswrapper[4830]: I0311 10:20:32.947464 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37a3d21-340c-48e8-bc89-f0fb0edd9c93" path="/var/lib/kubelet/pods/a37a3d21-340c-48e8-bc89-f0fb0edd9c93/volumes" Mar 11 10:20:33 crc kubenswrapper[4830]: I0311 10:20:33.279784 4830 generic.go:334] "Generic (PLEG): container finished" podID="4a9deb9d-d5db-411e-9e35-fa2abc938777" containerID="a61ffd02de8776306c3591b54b837085eb51c6c28d2e278fc7fb39d3e834eade" exitCode=0 Mar 11 10:20:33 crc kubenswrapper[4830]: I0311 10:20:33.279841 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" 
event={"ID":"4a9deb9d-d5db-411e-9e35-fa2abc938777","Type":"ContainerDied","Data":"a61ffd02de8776306c3591b54b837085eb51c6c28d2e278fc7fb39d3e834eade"} Mar 11 10:20:33 crc kubenswrapper[4830]: I0311 10:20:33.323626 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-q5cbz"] Mar 11 10:20:33 crc kubenswrapper[4830]: I0311 10:20:33.332325 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wlpqd/crc-debug-q5cbz"] Mar 11 10:20:34 crc kubenswrapper[4830]: I0311 10:20:34.397985 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:20:34 crc kubenswrapper[4830]: I0311 10:20:34.486569 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdkww\" (UniqueName: \"kubernetes.io/projected/4a9deb9d-d5db-411e-9e35-fa2abc938777-kube-api-access-wdkww\") pod \"4a9deb9d-d5db-411e-9e35-fa2abc938777\" (UID: \"4a9deb9d-d5db-411e-9e35-fa2abc938777\") " Mar 11 10:20:34 crc kubenswrapper[4830]: I0311 10:20:34.486954 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a9deb9d-d5db-411e-9e35-fa2abc938777-host\") pod \"4a9deb9d-d5db-411e-9e35-fa2abc938777\" (UID: \"4a9deb9d-d5db-411e-9e35-fa2abc938777\") " Mar 11 10:20:34 crc kubenswrapper[4830]: I0311 10:20:34.487053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a9deb9d-d5db-411e-9e35-fa2abc938777-host" (OuterVolumeSpecName: "host") pod "4a9deb9d-d5db-411e-9e35-fa2abc938777" (UID: "4a9deb9d-d5db-411e-9e35-fa2abc938777"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 11 10:20:34 crc kubenswrapper[4830]: I0311 10:20:34.487722 4830 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4a9deb9d-d5db-411e-9e35-fa2abc938777-host\") on node \"crc\" DevicePath \"\"" Mar 11 10:20:34 crc kubenswrapper[4830]: I0311 10:20:34.493064 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9deb9d-d5db-411e-9e35-fa2abc938777-kube-api-access-wdkww" (OuterVolumeSpecName: "kube-api-access-wdkww") pod "4a9deb9d-d5db-411e-9e35-fa2abc938777" (UID: "4a9deb9d-d5db-411e-9e35-fa2abc938777"). InnerVolumeSpecName "kube-api-access-wdkww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:20:34 crc kubenswrapper[4830]: I0311 10:20:34.589626 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdkww\" (UniqueName: \"kubernetes.io/projected/4a9deb9d-d5db-411e-9e35-fa2abc938777-kube-api-access-wdkww\") on node \"crc\" DevicePath \"\"" Mar 11 10:20:34 crc kubenswrapper[4830]: I0311 10:20:34.944801 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9deb9d-d5db-411e-9e35-fa2abc938777" path="/var/lib/kubelet/pods/4a9deb9d-d5db-411e-9e35-fa2abc938777/volumes" Mar 11 10:20:35 crc kubenswrapper[4830]: I0311 10:20:35.305074 4830 scope.go:117] "RemoveContainer" containerID="a61ffd02de8776306c3591b54b837085eb51c6c28d2e278fc7fb39d3e834eade" Mar 11 10:20:35 crc kubenswrapper[4830]: I0311 10:20:35.305136 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wlpqd/crc-debug-q5cbz" Mar 11 10:21:03 crc kubenswrapper[4830]: I0311 10:21:03.930842 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5r5x5"] Mar 11 10:21:03 crc kubenswrapper[4830]: E0311 10:21:03.931808 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9deb9d-d5db-411e-9e35-fa2abc938777" containerName="container-00" Mar 11 10:21:03 crc kubenswrapper[4830]: I0311 10:21:03.931823 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9deb9d-d5db-411e-9e35-fa2abc938777" containerName="container-00" Mar 11 10:21:03 crc kubenswrapper[4830]: I0311 10:21:03.931994 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9deb9d-d5db-411e-9e35-fa2abc938777" containerName="container-00" Mar 11 10:21:03 crc kubenswrapper[4830]: I0311 10:21:03.933728 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:03 crc kubenswrapper[4830]: I0311 10:21:03.941041 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5r5x5"] Mar 11 10:21:03 crc kubenswrapper[4830]: I0311 10:21:03.945375 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-catalog-content\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:03 crc kubenswrapper[4830]: I0311 10:21:03.945648 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbmn\" (UniqueName: \"kubernetes.io/projected/b15c3721-d6b8-4724-a7d3-429a5123201a-kube-api-access-xrbmn\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " 
pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:03 crc kubenswrapper[4830]: I0311 10:21:03.945981 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-utilities\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:04 crc kubenswrapper[4830]: I0311 10:21:04.047809 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-catalog-content\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:04 crc kubenswrapper[4830]: I0311 10:21:04.047917 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbmn\" (UniqueName: \"kubernetes.io/projected/b15c3721-d6b8-4724-a7d3-429a5123201a-kube-api-access-xrbmn\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:04 crc kubenswrapper[4830]: I0311 10:21:04.048045 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-utilities\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:04 crc kubenswrapper[4830]: I0311 10:21:04.048683 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-utilities\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 
10:21:04 crc kubenswrapper[4830]: I0311 10:21:04.048756 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-catalog-content\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:04 crc kubenswrapper[4830]: I0311 10:21:04.069381 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbmn\" (UniqueName: \"kubernetes.io/projected/b15c3721-d6b8-4724-a7d3-429a5123201a-kube-api-access-xrbmn\") pod \"redhat-operators-5r5x5\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:04 crc kubenswrapper[4830]: I0311 10:21:04.256993 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:04 crc kubenswrapper[4830]: I0311 10:21:04.715303 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5r5x5"] Mar 11 10:21:05 crc kubenswrapper[4830]: I0311 10:21:05.580749 4830 generic.go:334] "Generic (PLEG): container finished" podID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerID="93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e" exitCode=0 Mar 11 10:21:05 crc kubenswrapper[4830]: I0311 10:21:05.580843 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5r5x5" event={"ID":"b15c3721-d6b8-4724-a7d3-429a5123201a","Type":"ContainerDied","Data":"93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e"} Mar 11 10:21:05 crc kubenswrapper[4830]: I0311 10:21:05.581298 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5r5x5" 
event={"ID":"b15c3721-d6b8-4724-a7d3-429a5123201a","Type":"ContainerStarted","Data":"d71f866da09f890ac77676b1a5049d214991e8444fbaaafc8eb5c2985282d881"} Mar 11 10:21:06 crc kubenswrapper[4830]: I0311 10:21:06.169970 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7888bcc99b-t8slf_bbe664eb-daf0-4aeb-ae09-f47b2204bdf1/barbican-api/0.log" Mar 11 10:21:06 crc kubenswrapper[4830]: I0311 10:21:06.364837 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7888bcc99b-t8slf_bbe664eb-daf0-4aeb-ae09-f47b2204bdf1/barbican-api-log/0.log" Mar 11 10:21:06 crc kubenswrapper[4830]: I0311 10:21:06.657067 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c976ddb9d-ppssd_98ddc718-e67e-406f-aae3-03680232691b/barbican-keystone-listener-log/0.log" Mar 11 10:21:06 crc kubenswrapper[4830]: I0311 10:21:06.666310 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7c976ddb9d-ppssd_98ddc718-e67e-406f-aae3-03680232691b/barbican-keystone-listener/0.log" Mar 11 10:21:06 crc kubenswrapper[4830]: I0311 10:21:06.831504 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bddfb9bc9-6hzsp_b573b144-d9a4-4ea5-8b28-d9e4e3ed6274/barbican-worker/0.log" Mar 11 10:21:06 crc kubenswrapper[4830]: I0311 10:21:06.955614 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bddfb9bc9-6hzsp_b573b144-d9a4-4ea5-8b28-d9e4e3ed6274/barbican-worker-log/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.036332 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-m7b5d_751133b2-5530-48d1-9cb0-4e69aadf979a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.219467 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_9a97be49-616f-4338-b04a-9928016b4c26/ceilometer-notification-agent/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.257262 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9a97be49-616f-4338-b04a-9928016b4c26/ceilometer-central-agent/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.293400 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9a97be49-616f-4338-b04a-9928016b4c26/proxy-httpd/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.333751 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9a97be49-616f-4338-b04a-9928016b4c26/sg-core/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.506816 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f96f1f82-873d-4665-8273-65bfc41ba374/cinder-api/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.518782 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f96f1f82-873d-4665-8273-65bfc41ba374/cinder-api-log/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.602219 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5r5x5" event={"ID":"b15c3721-d6b8-4724-a7d3-429a5123201a","Type":"ContainerStarted","Data":"7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0"} Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.783477 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_65554cb9-6d98-4e70-8feb-73029d8184dc/probe/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.818952 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_65554cb9-6d98-4e70-8feb-73029d8184dc/cinder-scheduler/0.log" Mar 11 10:21:07 crc kubenswrapper[4830]: I0311 10:21:07.904475 4830 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-d6nzx_9711332d-adac-4289-81e4-686135601f68/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.051929 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-gpfl2_aacf9f52-24a2-462c-8957-3fb5c88988d3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.127941 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-cms8z_e6b1a549-c16a-4efe-83df-800de8dbdac2/init/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.295727 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-cms8z_e6b1a549-c16a-4efe-83df-800de8dbdac2/init/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.340144 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8c6f6df99-cms8z_e6b1a549-c16a-4efe-83df-800de8dbdac2/dnsmasq-dns/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.346058 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mz4j5_fae734f9-b26d-4252-b943-b09b3e235cfa/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.547882 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c8947c5-6c54-4acb-9100-3c5ea0988770/glance-httpd/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.603837 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4c8947c5-6c54-4acb-9100-3c5ea0988770/glance-log/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.720317 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_a6069e67-6f76-4a02-9c90-d1ac74d8aaca/glance-httpd/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.811441 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a6069e67-6f76-4a02-9c90-d1ac74d8aaca/glance-log/0.log" Mar 11 10:21:08 crc kubenswrapper[4830]: I0311 10:21:08.939461 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-789dc4b6cd-xz7ds_77e86c78-b565-4e6c-8867-519fa2d5137a/horizon/0.log" Mar 11 10:21:09 crc kubenswrapper[4830]: I0311 10:21:09.160696 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-672vg_13dfb6c4-9546-4a13-bc42-842a71c96c6c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:09 crc kubenswrapper[4830]: I0311 10:21:09.379112 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-jx2k4_5132dffb-d28b-494f-891d-ea13b54a5a72/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:09 crc kubenswrapper[4830]: I0311 10:21:09.419485 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-789dc4b6cd-xz7ds_77e86c78-b565-4e6c-8867-519fa2d5137a/horizon-log/0.log" Mar 11 10:21:09 crc kubenswrapper[4830]: I0311 10:21:09.600562 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-559977bfdc-r7ssx_3d8403ac-71e1-41f2-a897-bf61055308f6/keystone-api/0.log" Mar 11 10:21:09 crc kubenswrapper[4830]: I0311 10:21:09.645411 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29553721-rtbwn_04ab5666-8c5d-4e96-9c47-502bdc63bafb/keystone-cron/0.log" Mar 11 10:21:09 crc kubenswrapper[4830]: I0311 10:21:09.836539 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_e85b49bc-4607-4852-9fce-dcf43af1069f/kube-state-metrics/0.log" Mar 11 10:21:09 crc kubenswrapper[4830]: I0311 10:21:09.889041 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7m4gq_bfdc6f64-813a-4a57-a123-b4d15c6ae569/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:10 crc kubenswrapper[4830]: I0311 10:21:10.325948 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54c74bff69-478cc_55875815-4467-4c5e-8401-b220cb1694c6/neutron-api/0.log" Mar 11 10:21:10 crc kubenswrapper[4830]: I0311 10:21:10.335789 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54c74bff69-478cc_55875815-4467-4c5e-8401-b220cb1694c6/neutron-httpd/0.log" Mar 11 10:21:10 crc kubenswrapper[4830]: I0311 10:21:10.594326 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-92l85_48be5fba-f61d-4475-bbaf-df6ece9da972/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:11 crc kubenswrapper[4830]: I0311 10:21:11.012456 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d3eb0127-a012-4cbf-8768-84e20518f316/nova-api-log/0.log" Mar 11 10:21:11 crc kubenswrapper[4830]: I0311 10:21:11.072424 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_efc74c44-bbbb-4dd9-b762-f7c483d0e336/nova-cell0-conductor-conductor/0.log" Mar 11 10:21:11 crc kubenswrapper[4830]: I0311 10:21:11.332475 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d3eb0127-a012-4cbf-8768-84e20518f316/nova-api-api/0.log" Mar 11 10:21:11 crc kubenswrapper[4830]: I0311 10:21:11.407348 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e6bc7c45-10c9-4571-923a-4fb4b861657e/nova-cell1-conductor-conductor/0.log" 
Mar 11 10:21:11 crc kubenswrapper[4830]: I0311 10:21:11.507939 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f690df7f-ca68-4a5d-8e9e-4d7d55df4773/nova-cell1-novncproxy-novncproxy/0.log" Mar 11 10:21:11 crc kubenswrapper[4830]: I0311 10:21:11.584864 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-79mrm_df44fa5f-956c-47f8-af60-49a95e1c6da1/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:11 crc kubenswrapper[4830]: I0311 10:21:11.833411 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_72993026-e5ee-42ee-9381-36ec25d1d1d0/nova-metadata-log/0.log" Mar 11 10:21:12 crc kubenswrapper[4830]: I0311 10:21:12.166217 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ce0c7bf-830f-40f3-850f-19b0a879ba23/mysql-bootstrap/0.log" Mar 11 10:21:12 crc kubenswrapper[4830]: I0311 10:21:12.235911 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c53770ee-2b8a-4e7a-a59c-bc09739ce4e5/nova-scheduler-scheduler/0.log" Mar 11 10:21:12 crc kubenswrapper[4830]: I0311 10:21:12.405155 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ce0c7bf-830f-40f3-850f-19b0a879ba23/galera/0.log" Mar 11 10:21:12 crc kubenswrapper[4830]: I0311 10:21:12.520882 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9ce0c7bf-830f-40f3-850f-19b0a879ba23/mysql-bootstrap/0.log" Mar 11 10:21:12 crc kubenswrapper[4830]: I0311 10:21:12.630604 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ad5f765-f3dd-42f3-9829-2323ea982c58/mysql-bootstrap/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.207445 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_72993026-e5ee-42ee-9381-36ec25d1d1d0/nova-metadata-metadata/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.373675 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ad5f765-f3dd-42f3-9829-2323ea982c58/mysql-bootstrap/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.396290 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c113279e-3264-4a62-8c50-5ddb2be700bb/openstackclient/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.488968 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7ad5f765-f3dd-42f3-9829-2323ea982c58/galera/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.663563 4830 generic.go:334] "Generic (PLEG): container finished" podID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerID="7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0" exitCode=0 Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.663613 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5r5x5" event={"ID":"b15c3721-d6b8-4724-a7d3-429a5123201a","Type":"ContainerDied","Data":"7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0"} Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.676746 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-dp7ql_1a868478-8050-4c0f-a7f4-d6dcc82f9832/openstack-network-exporter/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.722863 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-klr5s_97278c10-fe96-4de6-86cf-09ff64444a59/ovsdb-server-init/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.915097 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-klr5s_97278c10-fe96-4de6-86cf-09ff64444a59/ovsdb-server-init/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.964816 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-klr5s_97278c10-fe96-4de6-86cf-09ff64444a59/ovsdb-server/0.log" Mar 11 10:21:13 crc kubenswrapper[4830]: I0311 10:21:13.985846 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-klr5s_97278c10-fe96-4de6-86cf-09ff64444a59/ovs-vswitchd/0.log" Mar 11 10:21:14 crc kubenswrapper[4830]: I0311 10:21:14.189240 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xjsks_d97948cc-fc42-46c8-b46e-3f8efdc251db/ovn-controller/0.log" Mar 11 10:21:14 crc kubenswrapper[4830]: I0311 10:21:14.250510 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-kqw9x_ae8bba5a-cb5f-402a-a57b-d1cc1ad226e3/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:14 crc kubenswrapper[4830]: I0311 10:21:14.476501 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0/openstack-network-exporter/0.log" Mar 11 10:21:14 crc kubenswrapper[4830]: I0311 10:21:14.513991 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af5dcb58-54a3-4ca2-a5b5-0a04bea6e6e0/ovn-northd/0.log" Mar 11 10:21:14 crc kubenswrapper[4830]: I0311 10:21:14.622420 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a1424a67-6a71-4943-b855-4795d2427214/openstack-network-exporter/0.log" Mar 11 10:21:14 crc kubenswrapper[4830]: I0311 10:21:14.673791 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5r5x5" 
event={"ID":"b15c3721-d6b8-4724-a7d3-429a5123201a","Type":"ContainerStarted","Data":"a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a"} Mar 11 10:21:14 crc kubenswrapper[4830]: I0311 10:21:14.695701 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5r5x5" podStartSLOduration=3.227752887 podStartE2EDuration="11.695681266s" podCreationTimestamp="2026-03-11 10:21:03 +0000 UTC" firstStartedPulling="2026-03-11 10:21:05.583065955 +0000 UTC m=+4033.364216644" lastFinishedPulling="2026-03-11 10:21:14.050994334 +0000 UTC m=+4041.832145023" observedRunningTime="2026-03-11 10:21:14.688864361 +0000 UTC m=+4042.470015040" watchObservedRunningTime="2026-03-11 10:21:14.695681266 +0000 UTC m=+4042.476831955" Mar 11 10:21:15 crc kubenswrapper[4830]: I0311 10:21:15.334910 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8f9d7ab5-467c-4888-8759-6e2ef59957e5/openstack-network-exporter/0.log" Mar 11 10:21:15 crc kubenswrapper[4830]: I0311 10:21:15.335289 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8f9d7ab5-467c-4888-8759-6e2ef59957e5/ovsdbserver-sb/0.log" Mar 11 10:21:15 crc kubenswrapper[4830]: I0311 10:21:15.352518 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a1424a67-6a71-4943-b855-4795d2427214/ovsdbserver-nb/0.log" Mar 11 10:21:15 crc kubenswrapper[4830]: I0311 10:21:15.635306 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599b4448-86g7s_da8c023e-1cf1-4a06-8c20-2b79612f7ae8/placement-api/0.log" Mar 11 10:21:15 crc kubenswrapper[4830]: I0311 10:21:15.698519 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-599b4448-86g7s_da8c023e-1cf1-4a06-8c20-2b79612f7ae8/placement-log/0.log" Mar 11 10:21:15 crc kubenswrapper[4830]: I0311 10:21:15.845400 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5da20462-be2b-466c-9c04-17b6a0a94572/setup-container/0.log" Mar 11 10:21:16 crc kubenswrapper[4830]: I0311 10:21:16.123861 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e0f47113-88e8-4b57-b9df-1ff8b05cde01/setup-container/0.log" Mar 11 10:21:16 crc kubenswrapper[4830]: I0311 10:21:16.172647 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5da20462-be2b-466c-9c04-17b6a0a94572/rabbitmq/0.log" Mar 11 10:21:16 crc kubenswrapper[4830]: I0311 10:21:16.213748 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5da20462-be2b-466c-9c04-17b6a0a94572/setup-container/0.log" Mar 11 10:21:16 crc kubenswrapper[4830]: I0311 10:21:16.410457 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e0f47113-88e8-4b57-b9df-1ff8b05cde01/rabbitmq/0.log" Mar 11 10:21:16 crc kubenswrapper[4830]: I0311 10:21:16.508479 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e0f47113-88e8-4b57-b9df-1ff8b05cde01/setup-container/0.log" Mar 11 10:21:16 crc kubenswrapper[4830]: I0311 10:21:16.535987 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6tz56_592c8d08-ac0e-4665-9d65-e362412b7867/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:16 crc kubenswrapper[4830]: I0311 10:21:16.803473 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-mrqxc_2b0b1934-6dd3-441c-923d-67b9ed28a177/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:16 crc kubenswrapper[4830]: I0311 10:21:16.890230 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-wvdgs_d57e6a98-80e8-40a0-af5d-56d936e6ab67/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.017996 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xf8b7_9ae7bc18-6614-4094-961f-9590aa0346f4/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.125176 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-swd9t_21e999cb-ceca-44f0-a7e8-cf0d801e84a7/ssh-known-hosts-edpm-deployment/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.377760 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7f56859c77-bnqc2_fcbbcfad-16c7-4040-9b06-b2ff9f4c5666/proxy-server/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.468266 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7f56859c77-bnqc2_fcbbcfad-16c7-4040-9b06-b2ff9f4c5666/proxy-httpd/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.540247 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-jmd8g_9e52669c-56df-4791-84e7-4d4bd34e420f/swift-ring-rebalance/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.715558 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/account-reaper/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.745598 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/account-auditor/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.890337 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/account-replicator/0.log" Mar 11 10:21:17 crc kubenswrapper[4830]: I0311 10:21:17.985907 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/account-server/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.022934 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/container-auditor/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.106391 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/container-replicator/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.132007 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/container-server/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.224032 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/container-updater/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.320088 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-auditor/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.394193 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-expirer/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.413388 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-replicator/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.593273 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-server/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.677719 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/object-updater/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.802860 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/rsync/0.log" Mar 11 10:21:18 crc kubenswrapper[4830]: I0311 10:21:18.998168 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_db4bedf3-ea20-4a63-9623-96286e9b243b/swift-recon-cron/0.log" Mar 11 10:21:19 crc kubenswrapper[4830]: I0311 10:21:19.050664 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-sjsn4_d6daac1f-f36f-42a1-9735-1b182e03052e/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:19 crc kubenswrapper[4830]: I0311 10:21:19.262903 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e3b4c71a-e1ca-48e5-9fdf-6310a8af11d3/tempest-tests-tempest-tests-runner/0.log" Mar 11 10:21:19 crc kubenswrapper[4830]: I0311 10:21:19.412290 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c01f3e65-5b46-4373-a420-2d966d66a081/test-operator-logs-container/0.log" Mar 11 10:21:19 crc kubenswrapper[4830]: I0311 10:21:19.557609 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xqlv8_4d5247a6-36f4-4260-88bd-659f66f5efc0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 11 10:21:24 crc kubenswrapper[4830]: I0311 10:21:24.257124 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 
10:21:24 crc kubenswrapper[4830]: I0311 10:21:24.263295 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:24 crc kubenswrapper[4830]: I0311 10:21:24.324172 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:25 crc kubenswrapper[4830]: I0311 10:21:25.131226 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:25 crc kubenswrapper[4830]: I0311 10:21:25.178202 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5r5x5"] Mar 11 10:21:26 crc kubenswrapper[4830]: I0311 10:21:26.798939 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5r5x5" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerName="registry-server" containerID="cri-o://a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a" gracePeriod=2 Mar 11 10:21:26 crc kubenswrapper[4830]: I0311 10:21:26.991443 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c5cnh"] Mar 11 10:21:26 crc kubenswrapper[4830]: I0311 10:21:26.995072 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.001986 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-utilities\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.002065 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-catalog-content\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.002141 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ldlp\" (UniqueName: \"kubernetes.io/projected/59352d4d-febc-46f3-9339-2a5a4876dcff-kube-api-access-9ldlp\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.038085 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5cnh"] Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.107306 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-utilities\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.107374 4830 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-catalog-content\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.107489 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ldlp\" (UniqueName: \"kubernetes.io/projected/59352d4d-febc-46f3-9339-2a5a4876dcff-kube-api-access-9ldlp\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.108457 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-utilities\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.108758 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-catalog-content\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.164499 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ldlp\" (UniqueName: \"kubernetes.io/projected/59352d4d-febc-46f3-9339-2a5a4876dcff-kube-api-access-9ldlp\") pod \"redhat-marketplace-c5cnh\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.336559 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.487518 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.524345 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrbmn\" (UniqueName: \"kubernetes.io/projected/b15c3721-d6b8-4724-a7d3-429a5123201a-kube-api-access-xrbmn\") pod \"b15c3721-d6b8-4724-a7d3-429a5123201a\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.524481 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-catalog-content\") pod \"b15c3721-d6b8-4724-a7d3-429a5123201a\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.524514 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-utilities\") pod \"b15c3721-d6b8-4724-a7d3-429a5123201a\" (UID: \"b15c3721-d6b8-4724-a7d3-429a5123201a\") " Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.526643 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-utilities" (OuterVolumeSpecName: "utilities") pod "b15c3721-d6b8-4724-a7d3-429a5123201a" (UID: "b15c3721-d6b8-4724-a7d3-429a5123201a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.534280 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15c3721-d6b8-4724-a7d3-429a5123201a-kube-api-access-xrbmn" (OuterVolumeSpecName: "kube-api-access-xrbmn") pod "b15c3721-d6b8-4724-a7d3-429a5123201a" (UID: "b15c3721-d6b8-4724-a7d3-429a5123201a"). InnerVolumeSpecName "kube-api-access-xrbmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.627725 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrbmn\" (UniqueName: \"kubernetes.io/projected/b15c3721-d6b8-4724-a7d3-429a5123201a-kube-api-access-xrbmn\") on node \"crc\" DevicePath \"\"" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.628005 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.749181 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b15c3721-d6b8-4724-a7d3-429a5123201a" (UID: "b15c3721-d6b8-4724-a7d3-429a5123201a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.812096 4830 generic.go:334] "Generic (PLEG): container finished" podID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerID="a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a" exitCode=0 Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.812142 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5r5x5" event={"ID":"b15c3721-d6b8-4724-a7d3-429a5123201a","Type":"ContainerDied","Data":"a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a"} Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.812170 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5r5x5" event={"ID":"b15c3721-d6b8-4724-a7d3-429a5123201a","Type":"ContainerDied","Data":"d71f866da09f890ac77676b1a5049d214991e8444fbaaafc8eb5c2985282d881"} Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.812186 4830 scope.go:117] "RemoveContainer" containerID="a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.812669 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5r5x5" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.835547 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b15c3721-d6b8-4724-a7d3-429a5123201a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.871173 4830 scope.go:117] "RemoveContainer" containerID="7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.889132 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5r5x5"] Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.906250 4830 scope.go:117] "RemoveContainer" containerID="93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e" Mar 11 10:21:27 crc kubenswrapper[4830]: I0311 10:21:27.960067 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5r5x5"] Mar 11 10:21:27 crc kubenswrapper[4830]: W0311 10:21:27.980248 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59352d4d_febc_46f3_9339_2a5a4876dcff.slice/crio-5c1223315f1b2d23e95ea04c49ec1dcfbccfa7ff39a20db7c5b000f2727fdd14 WatchSource:0}: Error finding container 5c1223315f1b2d23e95ea04c49ec1dcfbccfa7ff39a20db7c5b000f2727fdd14: Status 404 returned error can't find the container with id 5c1223315f1b2d23e95ea04c49ec1dcfbccfa7ff39a20db7c5b000f2727fdd14 Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.005452 4830 scope.go:117] "RemoveContainer" containerID="a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a" Mar 11 10:21:28 crc kubenswrapper[4830]: E0311 10:21:28.009247 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a\": container with ID starting with a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a not found: ID does not exist" containerID="a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a" Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.009282 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a"} err="failed to get container status \"a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a\": rpc error: code = NotFound desc = could not find container \"a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a\": container with ID starting with a79b267038600ba457fc8f580812f96db4d1d8723ba35812878b101781a7852a not found: ID does not exist" Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.009303 4830 scope.go:117] "RemoveContainer" containerID="7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0" Mar 11 10:21:28 crc kubenswrapper[4830]: E0311 10:21:28.011687 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0\": container with ID starting with 7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0 not found: ID does not exist" containerID="7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0" Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.011717 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0"} err="failed to get container status \"7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0\": rpc error: code = NotFound desc = could not find container \"7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0\": container with ID 
starting with 7c082427395f0dd6670f7d7d0581258b2b7bc3d5141fabea1348ec37e79290f0 not found: ID does not exist" Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.011733 4830 scope.go:117] "RemoveContainer" containerID="93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e" Mar 11 10:21:28 crc kubenswrapper[4830]: E0311 10:21:28.012222 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e\": container with ID starting with 93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e not found: ID does not exist" containerID="93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e" Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.012250 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e"} err="failed to get container status \"93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e\": rpc error: code = NotFound desc = could not find container \"93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e\": container with ID starting with 93e9c09947eb74f6f020d3201dfc4ed0d204da89c41e8e3123216d153b411d6e not found: ID does not exist" Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.020945 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5cnh"] Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.242011 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_60740879-ec5c-4d1f-bfd0-68ec5e8960f2/memcached/0.log" Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.822175 4830 generic.go:334] "Generic (PLEG): container finished" podID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerID="3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338" exitCode=0 Mar 11 10:21:28 crc 
kubenswrapper[4830]: I0311 10:21:28.822273 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5cnh" event={"ID":"59352d4d-febc-46f3-9339-2a5a4876dcff","Type":"ContainerDied","Data":"3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338"} Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.822613 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5cnh" event={"ID":"59352d4d-febc-46f3-9339-2a5a4876dcff","Type":"ContainerStarted","Data":"5c1223315f1b2d23e95ea04c49ec1dcfbccfa7ff39a20db7c5b000f2727fdd14"} Mar 11 10:21:28 crc kubenswrapper[4830]: I0311 10:21:28.943034 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" path="/var/lib/kubelet/pods/b15c3721-d6b8-4724-a7d3-429a5123201a/volumes" Mar 11 10:21:29 crc kubenswrapper[4830]: I0311 10:21:29.838953 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5cnh" event={"ID":"59352d4d-febc-46f3-9339-2a5a4876dcff","Type":"ContainerStarted","Data":"a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0"} Mar 11 10:21:30 crc kubenswrapper[4830]: I0311 10:21:30.850193 4830 generic.go:334] "Generic (PLEG): container finished" podID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerID="a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0" exitCode=0 Mar 11 10:21:30 crc kubenswrapper[4830]: I0311 10:21:30.850264 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5cnh" event={"ID":"59352d4d-febc-46f3-9339-2a5a4876dcff","Type":"ContainerDied","Data":"a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0"} Mar 11 10:21:31 crc kubenswrapper[4830]: I0311 10:21:31.863497 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5cnh" 
event={"ID":"59352d4d-febc-46f3-9339-2a5a4876dcff","Type":"ContainerStarted","Data":"3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324"} Mar 11 10:21:31 crc kubenswrapper[4830]: I0311 10:21:31.888299 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c5cnh" podStartSLOduration=3.435531273 podStartE2EDuration="5.888279012s" podCreationTimestamp="2026-03-11 10:21:26 +0000 UTC" firstStartedPulling="2026-03-11 10:21:28.82545115 +0000 UTC m=+4056.606601839" lastFinishedPulling="2026-03-11 10:21:31.278198889 +0000 UTC m=+4059.059349578" observedRunningTime="2026-03-11 10:21:31.881812577 +0000 UTC m=+4059.662963286" watchObservedRunningTime="2026-03-11 10:21:31.888279012 +0000 UTC m=+4059.669429701" Mar 11 10:21:37 crc kubenswrapper[4830]: I0311 10:21:37.338683 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:37 crc kubenswrapper[4830]: I0311 10:21:37.339330 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:37 crc kubenswrapper[4830]: I0311 10:21:37.384220 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:37 crc kubenswrapper[4830]: I0311 10:21:37.956260 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:38 crc kubenswrapper[4830]: I0311 10:21:38.001616 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5cnh"] Mar 11 10:21:39 crc kubenswrapper[4830]: I0311 10:21:39.930637 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c5cnh" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerName="registry-server" 
containerID="cri-o://3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324" gracePeriod=2 Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.443602 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.575451 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-utilities\") pod \"59352d4d-febc-46f3-9339-2a5a4876dcff\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.575752 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ldlp\" (UniqueName: \"kubernetes.io/projected/59352d4d-febc-46f3-9339-2a5a4876dcff-kube-api-access-9ldlp\") pod \"59352d4d-febc-46f3-9339-2a5a4876dcff\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.575830 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-catalog-content\") pod \"59352d4d-febc-46f3-9339-2a5a4876dcff\" (UID: \"59352d4d-febc-46f3-9339-2a5a4876dcff\") " Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.576517 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-utilities" (OuterVolumeSpecName: "utilities") pod "59352d4d-febc-46f3-9339-2a5a4876dcff" (UID: "59352d4d-febc-46f3-9339-2a5a4876dcff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.582232 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59352d4d-febc-46f3-9339-2a5a4876dcff-kube-api-access-9ldlp" (OuterVolumeSpecName: "kube-api-access-9ldlp") pod "59352d4d-febc-46f3-9339-2a5a4876dcff" (UID: "59352d4d-febc-46f3-9339-2a5a4876dcff"). InnerVolumeSpecName "kube-api-access-9ldlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.602290 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59352d4d-febc-46f3-9339-2a5a4876dcff" (UID: "59352d4d-febc-46f3-9339-2a5a4876dcff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.677582 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.677625 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ldlp\" (UniqueName: \"kubernetes.io/projected/59352d4d-febc-46f3-9339-2a5a4876dcff-kube-api-access-9ldlp\") on node \"crc\" DevicePath \"\"" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.677634 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59352d4d-febc-46f3-9339-2a5a4876dcff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.943206 4830 generic.go:334] "Generic (PLEG): container finished" podID="59352d4d-febc-46f3-9339-2a5a4876dcff" 
containerID="3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324" exitCode=0 Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.943256 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5cnh" event={"ID":"59352d4d-febc-46f3-9339-2a5a4876dcff","Type":"ContainerDied","Data":"3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324"} Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.943288 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5cnh" event={"ID":"59352d4d-febc-46f3-9339-2a5a4876dcff","Type":"ContainerDied","Data":"5c1223315f1b2d23e95ea04c49ec1dcfbccfa7ff39a20db7c5b000f2727fdd14"} Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.943311 4830 scope.go:117] "RemoveContainer" containerID="3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.943454 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5cnh" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.963889 4830 scope.go:117] "RemoveContainer" containerID="a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0" Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.991142 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5cnh"] Mar 11 10:21:40 crc kubenswrapper[4830]: I0311 10:21:40.996849 4830 scope.go:117] "RemoveContainer" containerID="3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338" Mar 11 10:21:41 crc kubenswrapper[4830]: I0311 10:21:41.000369 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5cnh"] Mar 11 10:21:41 crc kubenswrapper[4830]: I0311 10:21:41.044600 4830 scope.go:117] "RemoveContainer" containerID="3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324" Mar 11 10:21:41 crc kubenswrapper[4830]: E0311 10:21:41.045260 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324\": container with ID starting with 3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324 not found: ID does not exist" containerID="3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324" Mar 11 10:21:41 crc kubenswrapper[4830]: I0311 10:21:41.045313 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324"} err="failed to get container status \"3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324\": rpc error: code = NotFound desc = could not find container \"3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324\": container with ID starting with 3bac02b83665cd9a99f652e3fb747d5c4a4c69d54c1bd8c00507fb3e75fb3324 not found: 
ID does not exist" Mar 11 10:21:41 crc kubenswrapper[4830]: I0311 10:21:41.045345 4830 scope.go:117] "RemoveContainer" containerID="a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0" Mar 11 10:21:41 crc kubenswrapper[4830]: E0311 10:21:41.046039 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0\": container with ID starting with a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0 not found: ID does not exist" containerID="a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0" Mar 11 10:21:41 crc kubenswrapper[4830]: I0311 10:21:41.046088 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0"} err="failed to get container status \"a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0\": rpc error: code = NotFound desc = could not find container \"a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0\": container with ID starting with a44d74b5abdbf657b13088e65591f46743d97c282d64a021c66e4187e90554c0 not found: ID does not exist" Mar 11 10:21:41 crc kubenswrapper[4830]: I0311 10:21:41.046120 4830 scope.go:117] "RemoveContainer" containerID="3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338" Mar 11 10:21:41 crc kubenswrapper[4830]: E0311 10:21:41.046423 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338\": container with ID starting with 3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338 not found: ID does not exist" containerID="3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338" Mar 11 10:21:41 crc kubenswrapper[4830]: I0311 10:21:41.046451 4830 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338"} err="failed to get container status \"3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338\": rpc error: code = NotFound desc = could not find container \"3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338\": container with ID starting with 3534df1cebf684529c8ca238ecbdad86c7d9eee6f4fb4f06a394dc12bc9b1338 not found: ID does not exist" Mar 11 10:21:42 crc kubenswrapper[4830]: I0311 10:21:42.943335 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" path="/var/lib/kubelet/pods/59352d4d-febc-46f3-9339-2a5a4876dcff/volumes" Mar 11 10:21:46 crc kubenswrapper[4830]: I0311 10:21:46.599994 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/util/0.log" Mar 11 10:21:46 crc kubenswrapper[4830]: I0311 10:21:46.765214 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/pull/0.log" Mar 11 10:21:46 crc kubenswrapper[4830]: I0311 10:21:46.772143 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/util/0.log" Mar 11 10:21:46 crc kubenswrapper[4830]: I0311 10:21:46.788327 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/pull/0.log" Mar 11 10:21:46 crc kubenswrapper[4830]: I0311 10:21:46.994365 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/extract/0.log" Mar 11 10:21:47 crc kubenswrapper[4830]: I0311 10:21:47.237197 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/util/0.log" Mar 11 10:21:47 crc kubenswrapper[4830]: I0311 10:21:47.255466 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0f6fedc9fb108f7c15f273fd1d9ba8f07f2e9db3e612f1eb776a73860dtqtkf_5e7bf2dc-7510-484d-8531-fe0bf51767c3/pull/0.log" Mar 11 10:21:47 crc kubenswrapper[4830]: I0311 10:21:47.760991 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-n5khc_16121653-f66c-441b-b1e2-8cd3c1e558e4/manager/0.log" Mar 11 10:21:48 crc kubenswrapper[4830]: I0311 10:21:48.070998 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-7kzdv_d7442149-a02a-401b-b3bd-c1d470af5b3b/manager/0.log" Mar 11 10:21:48 crc kubenswrapper[4830]: I0311 10:21:48.193513 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-xg456_3112f394-9b8e-43c2-9707-94ac1a2778db/manager/0.log" Mar 11 10:21:48 crc kubenswrapper[4830]: I0311 10:21:48.466471 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-7cg7m_283b8bf7-046d-4600-be30-f578a6ec3c4d/manager/0.log" Mar 11 10:21:49 crc kubenswrapper[4830]: I0311 10:21:49.044229 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-mtd4g_fe16642f-b4c0-45e6-b222-83fcc2c3fb5c/manager/0.log" Mar 11 10:21:49 crc kubenswrapper[4830]: I0311 10:21:49.124244 
4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-ntcnm_74ba9d62-2d47-46a5-bd26-1a81bb0a8484/manager/0.log" Mar 11 10:21:49 crc kubenswrapper[4830]: I0311 10:21:49.448984 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-bcbp5_1ab4c9a1-d2a6-422f-b7ed-b6306f1fb38f/manager/0.log" Mar 11 10:21:49 crc kubenswrapper[4830]: I0311 10:21:49.622092 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-rjsvm_ceffcca8-5182-4f52-b359-e20664c1d527/manager/0.log" Mar 11 10:21:49 crc kubenswrapper[4830]: I0311 10:21:49.668454 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-9pvn7_2a85b060-3965-4d51-b568-2b360fee4c44/manager/0.log" Mar 11 10:21:49 crc kubenswrapper[4830]: I0311 10:21:49.874833 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-q98j9_c29c2a15-0eb3-41aa-b0b9-710a5ed56a87/manager/0.log" Mar 11 10:21:50 crc kubenswrapper[4830]: I0311 10:21:50.104919 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-5gkz6_6ee90085-fc25-4491-a2fc-9b45d5d8207a/manager/0.log" Mar 11 10:21:50 crc kubenswrapper[4830]: I0311 10:21:50.244517 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-c5vgk_2525ca9b-eb81-4fce-86b4-a767db795de6/manager/0.log" Mar 11 10:21:50 crc kubenswrapper[4830]: I0311 10:21:50.409836 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-kwh58_5602b15d-928b-4138-a7f0-66f8e8d037b8/manager/0.log" Mar 11 10:21:50 crc kubenswrapper[4830]: I0311 
10:21:50.594812 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7fhd9h_94d241ed-64bc-4152-b445-51ae5a61bb95/manager/0.log" Mar 11 10:21:50 crc kubenswrapper[4830]: I0311 10:21:50.949319 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-67d889964b-xg4rh_843e8d8e-4cb5-4260-af55-147e416c0791/operator/0.log" Mar 11 10:21:51 crc kubenswrapper[4830]: I0311 10:21:51.071907 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-6wtvn_fdd7a557-69d6-4baf-89e5-a8bf6219aaa0/registry-server/0.log" Mar 11 10:21:51 crc kubenswrapper[4830]: I0311 10:21:51.282070 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-xmctv_f6f6b27c-94d8-456d-8d41-19c905065e1d/manager/0.log" Mar 11 10:21:51 crc kubenswrapper[4830]: I0311 10:21:51.466401 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-57m9k_dedc5b41-d549-4015-b010-bc07cea3d318/manager/0.log" Mar 11 10:21:51 crc kubenswrapper[4830]: I0311 10:21:51.543405 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-nkqm4_14e5e0c3-1203-4a07-93bd-94578a7f0cb2/operator/0.log" Mar 11 10:21:51 crc kubenswrapper[4830]: I0311 10:21:51.782541 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-6rnp4_15d414b6-a515-4db4-b60c-a2b34004ea9c/manager/0.log" Mar 11 10:21:52 crc kubenswrapper[4830]: I0311 10:21:52.039150 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-jkths_c0c6f3c5-14c3-401f-90b4-3946ffc7e5e0/manager/0.log" Mar 11 10:21:52 crc 
kubenswrapper[4830]: I0311 10:21:52.277190 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-jhcz9_f488d4b3-55b5-424e-b00e-0bd262fc5f4f/manager/0.log" Mar 11 10:21:52 crc kubenswrapper[4830]: I0311 10:21:52.628319 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fcc5fcbf7-mw66h_f24d67a9-4996-4315-8c38-fa4ef58e0a52/manager/0.log" Mar 11 10:21:52 crc kubenswrapper[4830]: I0311 10:21:52.875120 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-vjktt_733a981c-36a1-442b-8e24-16a7498efc54/manager/0.log" Mar 11 10:21:57 crc kubenswrapper[4830]: I0311 10:21:57.307352 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-cxpww_d72c70bc-5f58-4c0f-a584-f352adf175e7/manager/0.log" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.150774 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553742-m57cf"] Mar 11 10:22:00 crc kubenswrapper[4830]: E0311 10:22:00.151807 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerName="extract-utilities" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.151825 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerName="extract-utilities" Mar 11 10:22:00 crc kubenswrapper[4830]: E0311 10:22:00.151840 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerName="extract-content" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.151848 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerName="extract-content" Mar 11 10:22:00 crc kubenswrapper[4830]: 
E0311 10:22:00.151868 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerName="registry-server" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.151877 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerName="registry-server" Mar 11 10:22:00 crc kubenswrapper[4830]: E0311 10:22:00.151893 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerName="extract-utilities" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.151902 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerName="extract-utilities" Mar 11 10:22:00 crc kubenswrapper[4830]: E0311 10:22:00.151934 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerName="extract-content" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.151942 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerName="extract-content" Mar 11 10:22:00 crc kubenswrapper[4830]: E0311 10:22:00.151961 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerName="registry-server" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.151968 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerName="registry-server" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.152221 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15c3721-d6b8-4724-a7d3-429a5123201a" containerName="registry-server" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.152250 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="59352d4d-febc-46f3-9339-2a5a4876dcff" containerName="registry-server" Mar 11 10:22:00 crc kubenswrapper[4830]: 
I0311 10:22:00.153075 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553742-m57cf" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.155145 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.155590 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.157791 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.162016 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553742-m57cf"] Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.248523 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w5gz\" (UniqueName: \"kubernetes.io/projected/a4886d88-dbe8-48c2-a70d-dfeb1027535a-kube-api-access-2w5gz\") pod \"auto-csr-approver-29553742-m57cf\" (UID: \"a4886d88-dbe8-48c2-a70d-dfeb1027535a\") " pod="openshift-infra/auto-csr-approver-29553742-m57cf" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.350601 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w5gz\" (UniqueName: \"kubernetes.io/projected/a4886d88-dbe8-48c2-a70d-dfeb1027535a-kube-api-access-2w5gz\") pod \"auto-csr-approver-29553742-m57cf\" (UID: \"a4886d88-dbe8-48c2-a70d-dfeb1027535a\") " pod="openshift-infra/auto-csr-approver-29553742-m57cf" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.369518 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w5gz\" (UniqueName: \"kubernetes.io/projected/a4886d88-dbe8-48c2-a70d-dfeb1027535a-kube-api-access-2w5gz\") pod 
\"auto-csr-approver-29553742-m57cf\" (UID: \"a4886d88-dbe8-48c2-a70d-dfeb1027535a\") " pod="openshift-infra/auto-csr-approver-29553742-m57cf" Mar 11 10:22:00 crc kubenswrapper[4830]: I0311 10:22:00.469505 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553742-m57cf" Mar 11 10:22:01 crc kubenswrapper[4830]: I0311 10:22:01.164748 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553742-m57cf"] Mar 11 10:22:02 crc kubenswrapper[4830]: I0311 10:22:02.171136 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553742-m57cf" event={"ID":"a4886d88-dbe8-48c2-a70d-dfeb1027535a","Type":"ContainerStarted","Data":"e7c3e6e3137d6dcc32838f8a89b3ec9dad9f4ccd66b6c4202c474008c90d5536"} Mar 11 10:22:03 crc kubenswrapper[4830]: I0311 10:22:03.181598 4830 generic.go:334] "Generic (PLEG): container finished" podID="a4886d88-dbe8-48c2-a70d-dfeb1027535a" containerID="8b59d263de4e240e4a3f54675b2c8f260a88ea7fe9408abdaa50664e8bf0b00e" exitCode=0 Mar 11 10:22:03 crc kubenswrapper[4830]: I0311 10:22:03.181684 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553742-m57cf" event={"ID":"a4886d88-dbe8-48c2-a70d-dfeb1027535a","Type":"ContainerDied","Data":"8b59d263de4e240e4a3f54675b2c8f260a88ea7fe9408abdaa50664e8bf0b00e"} Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.332075 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9bptp"] Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.336181 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.344285 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9bptp"] Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.428317 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-utilities\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.428445 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-catalog-content\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.428515 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xctbt\" (UniqueName: \"kubernetes.io/projected/abb67b42-f206-4ec2-adb3-e95a8759959f-kube-api-access-xctbt\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.529765 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-catalog-content\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.529876 4830 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xctbt\" (UniqueName: \"kubernetes.io/projected/abb67b42-f206-4ec2-adb3-e95a8759959f-kube-api-access-xctbt\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.529922 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-utilities\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.530246 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-catalog-content\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.530351 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-utilities\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.788604 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xctbt\" (UniqueName: \"kubernetes.io/projected/abb67b42-f206-4ec2-adb3-e95a8759959f-kube-api-access-xctbt\") pod \"certified-operators-9bptp\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") " pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:04 crc kubenswrapper[4830]: I0311 10:22:04.981933 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9bptp" Mar 11 10:22:05 crc kubenswrapper[4830]: I0311 10:22:05.011386 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553742-m57cf" Mar 11 10:22:05 crc kubenswrapper[4830]: I0311 10:22:05.140243 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w5gz\" (UniqueName: \"kubernetes.io/projected/a4886d88-dbe8-48c2-a70d-dfeb1027535a-kube-api-access-2w5gz\") pod \"a4886d88-dbe8-48c2-a70d-dfeb1027535a\" (UID: \"a4886d88-dbe8-48c2-a70d-dfeb1027535a\") " Mar 11 10:22:05 crc kubenswrapper[4830]: I0311 10:22:05.159298 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4886d88-dbe8-48c2-a70d-dfeb1027535a-kube-api-access-2w5gz" (OuterVolumeSpecName: "kube-api-access-2w5gz") pod "a4886d88-dbe8-48c2-a70d-dfeb1027535a" (UID: "a4886d88-dbe8-48c2-a70d-dfeb1027535a"). InnerVolumeSpecName "kube-api-access-2w5gz". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 10:22:05 crc kubenswrapper[4830]: I0311 10:22:05.243062 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w5gz\" (UniqueName: \"kubernetes.io/projected/a4886d88-dbe8-48c2-a70d-dfeb1027535a-kube-api-access-2w5gz\") on node \"crc\" DevicePath \"\""
Mar 11 10:22:05 crc kubenswrapper[4830]: I0311 10:22:05.257556 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553742-m57cf" event={"ID":"a4886d88-dbe8-48c2-a70d-dfeb1027535a","Type":"ContainerDied","Data":"e7c3e6e3137d6dcc32838f8a89b3ec9dad9f4ccd66b6c4202c474008c90d5536"}
Mar 11 10:22:05 crc kubenswrapper[4830]: I0311 10:22:05.257597 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c3e6e3137d6dcc32838f8a89b3ec9dad9f4ccd66b6c4202c474008c90d5536"
Mar 11 10:22:05 crc kubenswrapper[4830]: I0311 10:22:05.257603 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553742-m57cf"
Mar 11 10:22:05 crc kubenswrapper[4830]: E0311 10:22:05.476387 4830 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4886d88_dbe8_48c2_a70d_dfeb1027535a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4886d88_dbe8_48c2_a70d_dfeb1027535a.slice/crio-e7c3e6e3137d6dcc32838f8a89b3ec9dad9f4ccd66b6c4202c474008c90d5536\": RecentStats: unable to find data in memory cache]"
Mar 11 10:22:05 crc kubenswrapper[4830]: I0311 10:22:05.551906 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9bptp"]
Mar 11 10:22:05 crc kubenswrapper[4830]: W0311 10:22:05.563072 4830 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabb67b42_f206_4ec2_adb3_e95a8759959f.slice/crio-c02adfaa1df636f38a069a7cb57a4f4bc5782839e8b7ec400df36a918db72db6 WatchSource:0}: Error finding container c02adfaa1df636f38a069a7cb57a4f4bc5782839e8b7ec400df36a918db72db6: Status 404 returned error can't find the container with id c02adfaa1df636f38a069a7cb57a4f4bc5782839e8b7ec400df36a918db72db6
Mar 11 10:22:06 crc kubenswrapper[4830]: I0311 10:22:06.083206 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553736-dpcrm"]
Mar 11 10:22:06 crc kubenswrapper[4830]: I0311 10:22:06.091688 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553736-dpcrm"]
Mar 11 10:22:06 crc kubenswrapper[4830]: I0311 10:22:06.267166 4830 generic.go:334] "Generic (PLEG): container finished" podID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerID="ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e" exitCode=0
Mar 11 10:22:06 crc kubenswrapper[4830]: I0311 10:22:06.267220 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9bptp" event={"ID":"abb67b42-f206-4ec2-adb3-e95a8759959f","Type":"ContainerDied","Data":"ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e"}
Mar 11 10:22:06 crc kubenswrapper[4830]: I0311 10:22:06.267265 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9bptp" event={"ID":"abb67b42-f206-4ec2-adb3-e95a8759959f","Type":"ContainerStarted","Data":"c02adfaa1df636f38a069a7cb57a4f4bc5782839e8b7ec400df36a918db72db6"}
Mar 11 10:22:06 crc kubenswrapper[4830]: I0311 10:22:06.945123 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6a38b8-7842-4127-9926-e477ab93d5d6" path="/var/lib/kubelet/pods/8d6a38b8-7842-4127-9926-e477ab93d5d6/volumes"
Mar 11 10:22:07 crc kubenswrapper[4830]: I0311 10:22:07.277113 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9bptp" event={"ID":"abb67b42-f206-4ec2-adb3-e95a8759959f","Type":"ContainerStarted","Data":"2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa"}
Mar 11 10:22:08 crc kubenswrapper[4830]: I0311 10:22:08.288723 4830 generic.go:334] "Generic (PLEG): container finished" podID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerID="2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa" exitCode=0
Mar 11 10:22:08 crc kubenswrapper[4830]: I0311 10:22:08.288812 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9bptp" event={"ID":"abb67b42-f206-4ec2-adb3-e95a8759959f","Type":"ContainerDied","Data":"2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa"}
Mar 11 10:22:09 crc kubenswrapper[4830]: I0311 10:22:09.302570 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9bptp" event={"ID":"abb67b42-f206-4ec2-adb3-e95a8759959f","Type":"ContainerStarted","Data":"72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0"}
Mar 11 10:22:09 crc kubenswrapper[4830]: I0311 10:22:09.321658 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9bptp" podStartSLOduration=2.862492829 podStartE2EDuration="5.32163674s" podCreationTimestamp="2026-03-11 10:22:04 +0000 UTC" firstStartedPulling="2026-03-11 10:22:06.269421137 +0000 UTC m=+4094.050571836" lastFinishedPulling="2026-03-11 10:22:08.728565058 +0000 UTC m=+4096.509715747" observedRunningTime="2026-03-11 10:22:09.317875257 +0000 UTC m=+4097.099026016" watchObservedRunningTime="2026-03-11 10:22:09.32163674 +0000 UTC m=+4097.102787439"
Mar 11 10:22:13 crc kubenswrapper[4830]: I0311 10:22:13.060494 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 10:22:13 crc kubenswrapper[4830]: I0311 10:22:13.061074 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 10:22:14 crc kubenswrapper[4830]: I0311 10:22:14.446583 4830 scope.go:117] "RemoveContainer" containerID="732cf52308a6eb0823c629e97f48bfa4f3b65cb5f5077133cb6c8827f288954f"
Mar 11 10:22:14 crc kubenswrapper[4830]: I0311 10:22:14.983757 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9bptp"
Mar 11 10:22:14 crc kubenswrapper[4830]: I0311 10:22:14.984147 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9bptp"
Mar 11 10:22:15 crc kubenswrapper[4830]: I0311 10:22:15.027517 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9bptp"
Mar 11 10:22:15 crc kubenswrapper[4830]: I0311 10:22:15.073511 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s8cnh_fd48e28d-bcdd-4bba-a540-0213cda9599a/kube-rbac-proxy/0.log"
Mar 11 10:22:15 crc kubenswrapper[4830]: I0311 10:22:15.083235 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-tqscv_cecdcf05-62ae-4e22-8e1a-9a9d85d9e51c/control-plane-machine-set-operator/0.log"
Mar 11 10:22:15 crc kubenswrapper[4830]: I0311 10:22:15.218768 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-s8cnh_fd48e28d-bcdd-4bba-a540-0213cda9599a/machine-api-operator/0.log"
Mar 11 10:22:15 crc kubenswrapper[4830]: I0311 10:22:15.403581 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9bptp"
Mar 11 10:22:15 crc kubenswrapper[4830]: I0311 10:22:15.473840 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9bptp"]
Mar 11 10:22:17 crc kubenswrapper[4830]: I0311 10:22:17.369666 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9bptp" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerName="registry-server" containerID="cri-o://72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0" gracePeriod=2
Mar 11 10:22:17 crc kubenswrapper[4830]: I0311 10:22:17.852643 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9bptp"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.011040 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-utilities\") pod \"abb67b42-f206-4ec2-adb3-e95a8759959f\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") "
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.011223 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-catalog-content\") pod \"abb67b42-f206-4ec2-adb3-e95a8759959f\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") "
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.011283 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xctbt\" (UniqueName: \"kubernetes.io/projected/abb67b42-f206-4ec2-adb3-e95a8759959f-kube-api-access-xctbt\") pod \"abb67b42-f206-4ec2-adb3-e95a8759959f\" (UID: \"abb67b42-f206-4ec2-adb3-e95a8759959f\") "
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.012543 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-utilities" (OuterVolumeSpecName: "utilities") pod "abb67b42-f206-4ec2-adb3-e95a8759959f" (UID: "abb67b42-f206-4ec2-adb3-e95a8759959f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.016658 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb67b42-f206-4ec2-adb3-e95a8759959f-kube-api-access-xctbt" (OuterVolumeSpecName: "kube-api-access-xctbt") pod "abb67b42-f206-4ec2-adb3-e95a8759959f" (UID: "abb67b42-f206-4ec2-adb3-e95a8759959f"). InnerVolumeSpecName "kube-api-access-xctbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.070073 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abb67b42-f206-4ec2-adb3-e95a8759959f" (UID: "abb67b42-f206-4ec2-adb3-e95a8759959f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.117679 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.117715 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xctbt\" (UniqueName: \"kubernetes.io/projected/abb67b42-f206-4ec2-adb3-e95a8759959f-kube-api-access-xctbt\") on node \"crc\" DevicePath \"\""
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.117728 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abb67b42-f206-4ec2-adb3-e95a8759959f-utilities\") on node \"crc\" DevicePath \"\""
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.377342 4830 generic.go:334] "Generic (PLEG): container finished" podID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerID="72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0" exitCode=0
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.377378 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9bptp" event={"ID":"abb67b42-f206-4ec2-adb3-e95a8759959f","Type":"ContainerDied","Data":"72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0"}
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.377401 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9bptp" event={"ID":"abb67b42-f206-4ec2-adb3-e95a8759959f","Type":"ContainerDied","Data":"c02adfaa1df636f38a069a7cb57a4f4bc5782839e8b7ec400df36a918db72db6"}
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.377407 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9bptp"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.377419 4830 scope.go:117] "RemoveContainer" containerID="72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.394324 4830 scope.go:117] "RemoveContainer" containerID="2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.415432 4830 scope.go:117] "RemoveContainer" containerID="ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.427769 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9bptp"]
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.438382 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9bptp"]
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.457129 4830 scope.go:117] "RemoveContainer" containerID="72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0"
Mar 11 10:22:18 crc kubenswrapper[4830]: E0311 10:22:18.457501 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0\": container with ID starting with 72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0 not found: ID does not exist" containerID="72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.457533 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0"} err="failed to get container status \"72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0\": rpc error: code = NotFound desc = could not find container \"72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0\": container with ID starting with 72150dfd1dfd4b0375a1e43f3e8f29655b8df87f9a77124fa76d2e7025e053f0 not found: ID does not exist"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.457555 4830 scope.go:117] "RemoveContainer" containerID="2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa"
Mar 11 10:22:18 crc kubenswrapper[4830]: E0311 10:22:18.457855 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa\": container with ID starting with 2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa not found: ID does not exist" containerID="2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.457873 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa"} err="failed to get container status \"2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa\": rpc error: code = NotFound desc = could not find container \"2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa\": container with ID starting with 2cc5d823cdc86cc5989ee7ddd51bb8e92e53605c01edaad8ec198b53dbaad4fa not found: ID does not exist"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.457884 4830 scope.go:117] "RemoveContainer" containerID="ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e"
Mar 11 10:22:18 crc kubenswrapper[4830]: E0311 10:22:18.458116 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e\": container with ID starting with ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e not found: ID does not exist" containerID="ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.458138 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e"} err="failed to get container status \"ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e\": rpc error: code = NotFound desc = could not find container \"ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e\": container with ID starting with ab3478607b9c693f1b051cd4136aa6a8f510ddafa5e0becef4de5a772238d47e not found: ID does not exist"
Mar 11 10:22:18 crc kubenswrapper[4830]: I0311 10:22:18.946182 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" path="/var/lib/kubelet/pods/abb67b42-f206-4ec2-adb3-e95a8759959f/volumes"
Mar 11 10:22:27 crc kubenswrapper[4830]: I0311 10:22:27.258410 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-s7blt_1ef3b4a5-606d-4ed7-ba6f-be2095e5d84d/cert-manager-controller/0.log"
Mar 11 10:22:27 crc kubenswrapper[4830]: I0311 10:22:27.421113 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-9vzgv_c952f67d-03e4-4c30-a44b-884f26d81c4e/cert-manager-cainjector/0.log"
Mar 11 10:22:27 crc kubenswrapper[4830]: I0311 10:22:27.459164 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-xlw9r_8278ba6d-7719-4b12-9f80-29867e6fc2ba/cert-manager-webhook/0.log"
Mar 11 10:22:38 crc kubenswrapper[4830]: I0311 10:22:38.833209 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-ztp77_7bed2ffb-6685-4495-badf-1c70ea17d8fa/nmstate-console-plugin/0.log"
Mar 11 10:22:39 crc kubenswrapper[4830]: I0311 10:22:39.071487 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6gnfd_3dec7623-b8d0-4aa6-9a7f-0796475bcaaf/nmstate-handler/0.log"
Mar 11 10:22:39 crc kubenswrapper[4830]: I0311 10:22:39.111496 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qgmxm_20718750-ed46-4785-b2ca-0e41dfd093be/kube-rbac-proxy/0.log"
Mar 11 10:22:39 crc kubenswrapper[4830]: I0311 10:22:39.186843 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qgmxm_20718750-ed46-4785-b2ca-0e41dfd093be/nmstate-metrics/0.log"
Mar 11 10:22:39 crc kubenswrapper[4830]: I0311 10:22:39.301629 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-zqpc6_b282dd08-59c0-4a26-a7a0-e165dfc899b6/nmstate-operator/0.log"
Mar 11 10:22:39 crc kubenswrapper[4830]: I0311 10:22:39.409324 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-vzfnr_9e81d681-fa0d-4789-8762-ee953dc9f5aa/nmstate-webhook/0.log"
Mar 11 10:22:43 crc kubenswrapper[4830]: I0311 10:22:43.060715 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 10:22:43 crc kubenswrapper[4830]: I0311 10:22:43.061050 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.015246 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qg75l_29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be/kube-rbac-proxy/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.220773 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qg75l_29d5f39f-8a74-4d04-9bf0-d0ee24cfb1be/controller/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.428923 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-frr-files/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.625376 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-metrics/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.637604 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-reloader/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.654996 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-reloader/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.659841 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-frr-files/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.914657 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-metrics/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.939924 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-reloader/0.log"
Mar 11 10:23:07 crc kubenswrapper[4830]: I0311 10:23:07.987372 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-frr-files/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.007930 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-metrics/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.216934 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-frr-files/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.222874 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-reloader/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.267042 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/cp-metrics/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.307793 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/controller/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.416121 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/frr-metrics/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.493422 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/kube-rbac-proxy/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.534534 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/kube-rbac-proxy-frr/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.658536 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/reloader/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.765680 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-svpm2_bd1c2f0c-c126-4cd9-863d-6ec94f3920ba/frr-k8s-webhook-server/0.log"
Mar 11 10:23:08 crc kubenswrapper[4830]: I0311 10:23:08.974817 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85c56b6668-mc4fj_e3733633-d23b-4ef9-90cf-89614677589d/manager/0.log"
Mar 11 10:23:09 crc kubenswrapper[4830]: I0311 10:23:09.144476 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-d9865c9bc-zgs5r_847b6273-e498-4025-a834-41173cfce564/webhook-server/0.log"
Mar 11 10:23:09 crc kubenswrapper[4830]: I0311 10:23:09.221820 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gmrbg_08cd58a8-ee9e-44a8-874f-2187733e6d57/kube-rbac-proxy/0.log"
Mar 11 10:23:09 crc kubenswrapper[4830]: I0311 10:23:09.819869 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gmrbg_08cd58a8-ee9e-44a8-874f-2187733e6d57/speaker/0.log"
Mar 11 10:23:10 crc kubenswrapper[4830]: I0311 10:23:10.338815 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mvhbg_969c667a-e499-4c6a-9da3-d7813886c794/frr/0.log"
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.060420 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.060735 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.060779 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8"
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.061544 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"faa8f29715aee87d768668d7a8b45badc0d9d0b5b2bf527a7d66e539bd3a1912"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.061603 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://faa8f29715aee87d768668d7a8b45badc0d9d0b5b2bf527a7d66e539bd3a1912" gracePeriod=600
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.876195 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="faa8f29715aee87d768668d7a8b45badc0d9d0b5b2bf527a7d66e539bd3a1912" exitCode=0
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.876389 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"faa8f29715aee87d768668d7a8b45badc0d9d0b5b2bf527a7d66e539bd3a1912"}
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.876815 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerStarted","Data":"2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a"}
Mar 11 10:23:13 crc kubenswrapper[4830]: I0311 10:23:13.876893 4830 scope.go:117] "RemoveContainer" containerID="29d9e18cab9cb2341cf057e18ac6780a09aa1a038c35c0c6f120f8789f4456c2"
Mar 11 10:23:23 crc kubenswrapper[4830]: I0311 10:23:23.971635 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/util/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.165741 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/pull/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.176947 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/util/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.190591 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/pull/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.419233 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/pull/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.451661 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/util/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.484280 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874hjft8_9c57794f-6ae4-4350-9a03-efe9a10f5d47/extract/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.631272 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/util/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.873860 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/util/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.878094 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/pull/0.log"
Mar 11 10:23:24 crc kubenswrapper[4830]: I0311 10:23:24.882774 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/pull/0.log"
Mar 11 10:23:25 crc kubenswrapper[4830]: I0311 10:23:25.102818 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/extract/0.log"
Mar 11 10:23:25 crc kubenswrapper[4830]: I0311 10:23:25.107386 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/pull/0.log"
Mar 11 10:23:25 crc kubenswrapper[4830]: I0311 10:23:25.123326 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1sr5ql_c7147327-7c7b-4a1d-94c7-c684ec5337a0/util/0.log"
Mar 11 10:23:25 crc kubenswrapper[4830]: I0311 10:23:25.758385 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-utilities/0.log"
Mar 11 10:23:25 crc kubenswrapper[4830]: I0311 10:23:25.908426 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-content/0.log"
Mar 11 10:23:25 crc kubenswrapper[4830]: I0311 10:23:25.912158 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-utilities/0.log"
Mar 11 10:23:25 crc kubenswrapper[4830]: I0311 10:23:25.923253 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-content/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.156784 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-content/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.165833 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/extract-utilities/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.393039 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-utilities/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.566260 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-sdfbn_f470d035-fb23-4ecc-b36e-b61886bfab43/registry-server/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.693744 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-utilities/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.724652 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-content/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.757344 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-content/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.887464 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-utilities/0.log"
Mar 11 10:23:26 crc kubenswrapper[4830]: I0311 10:23:26.895407 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/extract-content/0.log"
Mar 11 10:23:27 crc kubenswrapper[4830]: I0311 10:23:27.133405 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-vjsls_74cca8dd-8cb4-41db-9626-9612877ad60e/marketplace-operator/0.log"
Mar 11 10:23:27 crc kubenswrapper[4830]: I0311 10:23:27.616890 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-utilities/0.log"
Mar 11 10:23:27 crc kubenswrapper[4830]: I0311 10:23:27.740893 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-nt9xr_47ec45e2-dfcb-4c82-955a-0f820e2d0210/registry-server/0.log"
Mar 11 10:23:27 crc kubenswrapper[4830]: I0311 10:23:27.820940 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-utilities/0.log"
Mar 11 10:23:27 crc kubenswrapper[4830]: I0311 10:23:27.854600 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-content/0.log"
Mar 11 10:23:27 crc kubenswrapper[4830]: I0311 10:23:27.857428 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-content/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.090409 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-utilities/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.129629 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-utilities/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.151328 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/extract-content/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.202194 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b525m_878a9e87-f5ea-4c96-8439-28ccc445778b/registry-server/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.367285 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-content/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.389585 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-utilities/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.483781 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-content/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.620748 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-content/0.log"
Mar 11 10:23:28 crc kubenswrapper[4830]: I0311 10:23:28.657770 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/extract-utilities/0.log"
Mar 11 10:23:29 crc kubenswrapper[4830]: I0311 10:23:29.204689 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c85g7_3248986c-2bdb-4095-87d0-07eaf6acd7b1/registry-server/0.log"
Mar 11 10:23:54 crc kubenswrapper[4830]: E0311 10:23:54.603275 4830 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.169:60222->38.102.83.169:36145: write tcp 38.102.83.169:60222->38.102.83.169:36145: write: broken pipe
Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.145898 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553744-rtkfr"]
Mar 11 10:24:00 crc kubenswrapper[4830]: E0311 10:24:00.146970 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4886d88-dbe8-48c2-a70d-dfeb1027535a" containerName="oc"
Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.146989 4830 state_mem.go:107]
"Deleted CPUSet assignment" podUID="a4886d88-dbe8-48c2-a70d-dfeb1027535a" containerName="oc" Mar 11 10:24:00 crc kubenswrapper[4830]: E0311 10:24:00.147000 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerName="extract-utilities" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.147008 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerName="extract-utilities" Mar 11 10:24:00 crc kubenswrapper[4830]: E0311 10:24:00.147047 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerName="extract-content" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.147056 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerName="extract-content" Mar 11 10:24:00 crc kubenswrapper[4830]: E0311 10:24:00.147070 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerName="registry-server" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.147077 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerName="registry-server" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.147290 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="abb67b42-f206-4ec2-adb3-e95a8759959f" containerName="registry-server" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.147324 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4886d88-dbe8-48c2-a70d-dfeb1027535a" containerName="oc" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.148124 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553744-rtkfr" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.150296 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.150503 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.159246 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553744-rtkfr"] Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.164884 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.263390 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsfg7\" (UniqueName: \"kubernetes.io/projected/50b1b2ba-998b-44ac-a5f8-e78c995939bc-kube-api-access-jsfg7\") pod \"auto-csr-approver-29553744-rtkfr\" (UID: \"50b1b2ba-998b-44ac-a5f8-e78c995939bc\") " pod="openshift-infra/auto-csr-approver-29553744-rtkfr" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.364854 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsfg7\" (UniqueName: \"kubernetes.io/projected/50b1b2ba-998b-44ac-a5f8-e78c995939bc-kube-api-access-jsfg7\") pod \"auto-csr-approver-29553744-rtkfr\" (UID: \"50b1b2ba-998b-44ac-a5f8-e78c995939bc\") " pod="openshift-infra/auto-csr-approver-29553744-rtkfr" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.388252 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsfg7\" (UniqueName: \"kubernetes.io/projected/50b1b2ba-998b-44ac-a5f8-e78c995939bc-kube-api-access-jsfg7\") pod \"auto-csr-approver-29553744-rtkfr\" (UID: \"50b1b2ba-998b-44ac-a5f8-e78c995939bc\") " 
pod="openshift-infra/auto-csr-approver-29553744-rtkfr" Mar 11 10:24:00 crc kubenswrapper[4830]: I0311 10:24:00.469418 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553744-rtkfr" Mar 11 10:24:01 crc kubenswrapper[4830]: I0311 10:24:01.485203 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553744-rtkfr"] Mar 11 10:24:02 crc kubenswrapper[4830]: I0311 10:24:02.351893 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553744-rtkfr" event={"ID":"50b1b2ba-998b-44ac-a5f8-e78c995939bc","Type":"ContainerStarted","Data":"7bbdb12387fdbffbe62fa979214367c844d25d7e424505d7c7d4946d331d5f34"} Mar 11 10:24:03 crc kubenswrapper[4830]: I0311 10:24:03.362608 4830 generic.go:334] "Generic (PLEG): container finished" podID="50b1b2ba-998b-44ac-a5f8-e78c995939bc" containerID="aba0618c18a11d84ea1fe9c7a5feb3debc279ea8471616ecb064f0c06d4e20bb" exitCode=0 Mar 11 10:24:03 crc kubenswrapper[4830]: I0311 10:24:03.362747 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553744-rtkfr" event={"ID":"50b1b2ba-998b-44ac-a5f8-e78c995939bc","Type":"ContainerDied","Data":"aba0618c18a11d84ea1fe9c7a5feb3debc279ea8471616ecb064f0c06d4e20bb"} Mar 11 10:24:04 crc kubenswrapper[4830]: I0311 10:24:04.909562 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553744-rtkfr" Mar 11 10:24:05 crc kubenswrapper[4830]: I0311 10:24:05.060302 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsfg7\" (UniqueName: \"kubernetes.io/projected/50b1b2ba-998b-44ac-a5f8-e78c995939bc-kube-api-access-jsfg7\") pod \"50b1b2ba-998b-44ac-a5f8-e78c995939bc\" (UID: \"50b1b2ba-998b-44ac-a5f8-e78c995939bc\") " Mar 11 10:24:05 crc kubenswrapper[4830]: I0311 10:24:05.066223 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b1b2ba-998b-44ac-a5f8-e78c995939bc-kube-api-access-jsfg7" (OuterVolumeSpecName: "kube-api-access-jsfg7") pod "50b1b2ba-998b-44ac-a5f8-e78c995939bc" (UID: "50b1b2ba-998b-44ac-a5f8-e78c995939bc"). InnerVolumeSpecName "kube-api-access-jsfg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:24:05 crc kubenswrapper[4830]: I0311 10:24:05.162330 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsfg7\" (UniqueName: \"kubernetes.io/projected/50b1b2ba-998b-44ac-a5f8-e78c995939bc-kube-api-access-jsfg7\") on node \"crc\" DevicePath \"\"" Mar 11 10:24:05 crc kubenswrapper[4830]: I0311 10:24:05.382927 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553744-rtkfr" event={"ID":"50b1b2ba-998b-44ac-a5f8-e78c995939bc","Type":"ContainerDied","Data":"7bbdb12387fdbffbe62fa979214367c844d25d7e424505d7c7d4946d331d5f34"} Mar 11 10:24:05 crc kubenswrapper[4830]: I0311 10:24:05.382969 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bbdb12387fdbffbe62fa979214367c844d25d7e424505d7c7d4946d331d5f34" Mar 11 10:24:05 crc kubenswrapper[4830]: I0311 10:24:05.383012 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553744-rtkfr" Mar 11 10:24:05 crc kubenswrapper[4830]: I0311 10:24:05.986124 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553738-5m2l9"] Mar 11 10:24:06 crc kubenswrapper[4830]: I0311 10:24:06.000691 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553738-5m2l9"] Mar 11 10:24:06 crc kubenswrapper[4830]: I0311 10:24:06.943586 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd741083-a35c-4930-a1fa-c16df13d01af" path="/var/lib/kubelet/pods/fd741083-a35c-4930-a1fa-c16df13d01af/volumes" Mar 11 10:24:14 crc kubenswrapper[4830]: I0311 10:24:14.572735 4830 scope.go:117] "RemoveContainer" containerID="b1e21e68b3d6621568ba6a422cd0477f13cfe36c88a3cf6439c31ac61f88a7c7" Mar 11 10:25:13 crc kubenswrapper[4830]: I0311 10:25:13.060979 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:25:13 crc kubenswrapper[4830]: I0311 10:25:13.061537 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:25:24 crc kubenswrapper[4830]: I0311 10:25:24.265295 4830 generic.go:334] "Generic (PLEG): container finished" podID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerID="8de9cf0cc7d19423c3cb92479f282ea845709ffe47b94d4fc06a486d914c283f" exitCode=0 Mar 11 10:25:24 crc kubenswrapper[4830]: I0311 10:25:24.265396 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-wlpqd/must-gather-8l2hl" event={"ID":"50a6aad1-7754-45c2-8c6e-ce71e051cd18","Type":"ContainerDied","Data":"8de9cf0cc7d19423c3cb92479f282ea845709ffe47b94d4fc06a486d914c283f"} Mar 11 10:25:24 crc kubenswrapper[4830]: I0311 10:25:24.267636 4830 scope.go:117] "RemoveContainer" containerID="8de9cf0cc7d19423c3cb92479f282ea845709ffe47b94d4fc06a486d914c283f" Mar 11 10:25:24 crc kubenswrapper[4830]: I0311 10:25:24.361936 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wlpqd_must-gather-8l2hl_50a6aad1-7754-45c2-8c6e-ce71e051cd18/gather/0.log" Mar 11 10:25:35 crc kubenswrapper[4830]: I0311 10:25:35.763042 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wlpqd/must-gather-8l2hl"] Mar 11 10:25:35 crc kubenswrapper[4830]: I0311 10:25:35.763901 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" podUID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerName="copy" containerID="cri-o://8c1b68978767b2d02795c43c983500b5fe2dfa17b8ca28047a25f63e15c1472a" gracePeriod=2 Mar 11 10:25:35 crc kubenswrapper[4830]: I0311 10:25:35.774376 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wlpqd/must-gather-8l2hl"] Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.377351 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wlpqd_must-gather-8l2hl_50a6aad1-7754-45c2-8c6e-ce71e051cd18/copy/0.log" Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.377881 4830 generic.go:334] "Generic (PLEG): container finished" podID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerID="8c1b68978767b2d02795c43c983500b5fe2dfa17b8ca28047a25f63e15c1472a" exitCode=143 Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.590914 4830 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-wlpqd_must-gather-8l2hl_50a6aad1-7754-45c2-8c6e-ce71e051cd18/copy/0.log" Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.591388 4830 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.700867 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/50a6aad1-7754-45c2-8c6e-ce71e051cd18-must-gather-output\") pod \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\" (UID: \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\") " Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.701066 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg9xs\" (UniqueName: \"kubernetes.io/projected/50a6aad1-7754-45c2-8c6e-ce71e051cd18-kube-api-access-cg9xs\") pod \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\" (UID: \"50a6aad1-7754-45c2-8c6e-ce71e051cd18\") " Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.707796 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a6aad1-7754-45c2-8c6e-ce71e051cd18-kube-api-access-cg9xs" (OuterVolumeSpecName: "kube-api-access-cg9xs") pod "50a6aad1-7754-45c2-8c6e-ce71e051cd18" (UID: "50a6aad1-7754-45c2-8c6e-ce71e051cd18"). InnerVolumeSpecName "kube-api-access-cg9xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.804053 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg9xs\" (UniqueName: \"kubernetes.io/projected/50a6aad1-7754-45c2-8c6e-ce71e051cd18-kube-api-access-cg9xs\") on node \"crc\" DevicePath \"\"" Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.886753 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a6aad1-7754-45c2-8c6e-ce71e051cd18-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "50a6aad1-7754-45c2-8c6e-ce71e051cd18" (UID: "50a6aad1-7754-45c2-8c6e-ce71e051cd18"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.905873 4830 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/50a6aad1-7754-45c2-8c6e-ce71e051cd18-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 11 10:25:36 crc kubenswrapper[4830]: I0311 10:25:36.945933 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" path="/var/lib/kubelet/pods/50a6aad1-7754-45c2-8c6e-ce71e051cd18/volumes" Mar 11 10:25:37 crc kubenswrapper[4830]: I0311 10:25:37.396943 4830 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wlpqd_must-gather-8l2hl_50a6aad1-7754-45c2-8c6e-ce71e051cd18/copy/0.log" Mar 11 10:25:37 crc kubenswrapper[4830]: I0311 10:25:37.397654 4830 scope.go:117] "RemoveContainer" containerID="8c1b68978767b2d02795c43c983500b5fe2dfa17b8ca28047a25f63e15c1472a" Mar 11 10:25:37 crc kubenswrapper[4830]: I0311 10:25:37.397665 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wlpqd/must-gather-8l2hl" Mar 11 10:25:37 crc kubenswrapper[4830]: I0311 10:25:37.433959 4830 scope.go:117] "RemoveContainer" containerID="8de9cf0cc7d19423c3cb92479f282ea845709ffe47b94d4fc06a486d914c283f" Mar 11 10:25:43 crc kubenswrapper[4830]: I0311 10:25:43.060347 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:25:43 crc kubenswrapper[4830]: I0311 10:25:43.060808 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.145837 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553746-thwkk"] Mar 11 10:26:00 crc kubenswrapper[4830]: E0311 10:26:00.146800 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerName="copy" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.146813 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerName="copy" Mar 11 10:26:00 crc kubenswrapper[4830]: E0311 10:26:00.146831 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerName="gather" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.146837 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerName="gather" Mar 11 10:26:00 crc kubenswrapper[4830]: E0311 10:26:00.146845 4830 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b1b2ba-998b-44ac-a5f8-e78c995939bc" containerName="oc" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.146851 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b1b2ba-998b-44ac-a5f8-e78c995939bc" containerName="oc" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.147058 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b1b2ba-998b-44ac-a5f8-e78c995939bc" containerName="oc" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.147075 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerName="copy" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.147094 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a6aad1-7754-45c2-8c6e-ce71e051cd18" containerName="gather" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.147740 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553746-thwkk" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.150973 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.151230 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.151358 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.160224 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553746-thwkk"] Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.170349 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2m5\" (UniqueName: 
\"kubernetes.io/projected/86da2375-6a6d-4d75-9848-bf16a0287553-kube-api-access-cz2m5\") pod \"auto-csr-approver-29553746-thwkk\" (UID: \"86da2375-6a6d-4d75-9848-bf16a0287553\") " pod="openshift-infra/auto-csr-approver-29553746-thwkk" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.271783 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2m5\" (UniqueName: \"kubernetes.io/projected/86da2375-6a6d-4d75-9848-bf16a0287553-kube-api-access-cz2m5\") pod \"auto-csr-approver-29553746-thwkk\" (UID: \"86da2375-6a6d-4d75-9848-bf16a0287553\") " pod="openshift-infra/auto-csr-approver-29553746-thwkk" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.291987 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2m5\" (UniqueName: \"kubernetes.io/projected/86da2375-6a6d-4d75-9848-bf16a0287553-kube-api-access-cz2m5\") pod \"auto-csr-approver-29553746-thwkk\" (UID: \"86da2375-6a6d-4d75-9848-bf16a0287553\") " pod="openshift-infra/auto-csr-approver-29553746-thwkk" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.470475 4830 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553746-thwkk" Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.913750 4830 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 11 10:26:00 crc kubenswrapper[4830]: I0311 10:26:00.918945 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553746-thwkk"] Mar 11 10:26:01 crc kubenswrapper[4830]: I0311 10:26:01.597514 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553746-thwkk" event={"ID":"86da2375-6a6d-4d75-9848-bf16a0287553","Type":"ContainerStarted","Data":"d45e9dc41cd5b3c252ba3bdf40c7e6b1124769a53741730f89b707dc11c93577"} Mar 11 10:26:02 crc kubenswrapper[4830]: I0311 10:26:02.606729 4830 generic.go:334] "Generic (PLEG): container finished" podID="86da2375-6a6d-4d75-9848-bf16a0287553" containerID="f6a20b1b53417cb79cc248ed48c40761256e4ba3d4e1abec19fb654cb185e9ff" exitCode=0 Mar 11 10:26:02 crc kubenswrapper[4830]: I0311 10:26:02.606787 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553746-thwkk" event={"ID":"86da2375-6a6d-4d75-9848-bf16a0287553","Type":"ContainerDied","Data":"f6a20b1b53417cb79cc248ed48c40761256e4ba3d4e1abec19fb654cb185e9ff"} Mar 11 10:26:03 crc kubenswrapper[4830]: I0311 10:26:03.928532 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553746-thwkk" Mar 11 10:26:03 crc kubenswrapper[4830]: I0311 10:26:03.942279 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz2m5\" (UniqueName: \"kubernetes.io/projected/86da2375-6a6d-4d75-9848-bf16a0287553-kube-api-access-cz2m5\") pod \"86da2375-6a6d-4d75-9848-bf16a0287553\" (UID: \"86da2375-6a6d-4d75-9848-bf16a0287553\") " Mar 11 10:26:03 crc kubenswrapper[4830]: I0311 10:26:03.951682 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86da2375-6a6d-4d75-9848-bf16a0287553-kube-api-access-cz2m5" (OuterVolumeSpecName: "kube-api-access-cz2m5") pod "86da2375-6a6d-4d75-9848-bf16a0287553" (UID: "86da2375-6a6d-4d75-9848-bf16a0287553"). InnerVolumeSpecName "kube-api-access-cz2m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:26:04 crc kubenswrapper[4830]: I0311 10:26:04.047534 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz2m5\" (UniqueName: \"kubernetes.io/projected/86da2375-6a6d-4d75-9848-bf16a0287553-kube-api-access-cz2m5\") on node \"crc\" DevicePath \"\"" Mar 11 10:26:04 crc kubenswrapper[4830]: I0311 10:26:04.625877 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553746-thwkk" event={"ID":"86da2375-6a6d-4d75-9848-bf16a0287553","Type":"ContainerDied","Data":"d45e9dc41cd5b3c252ba3bdf40c7e6b1124769a53741730f89b707dc11c93577"} Mar 11 10:26:04 crc kubenswrapper[4830]: I0311 10:26:04.625915 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45e9dc41cd5b3c252ba3bdf40c7e6b1124769a53741730f89b707dc11c93577" Mar 11 10:26:04 crc kubenswrapper[4830]: I0311 10:26:04.625962 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553746-thwkk" Mar 11 10:26:04 crc kubenswrapper[4830]: I0311 10:26:04.998873 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553740-zlmdr"] Mar 11 10:26:05 crc kubenswrapper[4830]: I0311 10:26:05.006413 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553740-zlmdr"] Mar 11 10:26:06 crc kubenswrapper[4830]: I0311 10:26:06.953114 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1dd9ac-f029-479e-9e4f-f81d1810d084" path="/var/lib/kubelet/pods/ec1dd9ac-f029-479e-9e4f-f81d1810d084/volumes" Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.060467 4830 patch_prober.go:28] interesting pod/machine-config-daemon-p7jq8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.060819 4830 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.060884 4830 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.062014 4830 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a"} pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.062087 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerName="machine-config-daemon" containerID="cri-o://2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" gracePeriod=600 Mar 11 10:26:13 crc kubenswrapper[4830]: E0311 10:26:13.206378 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.707437 4830 generic.go:334] "Generic (PLEG): container finished" podID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" exitCode=0 Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.707518 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" event={"ID":"2bdde2fd-3db4-4b41-9287-58960dcab5d9","Type":"ContainerDied","Data":"2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a"} Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.707830 4830 scope.go:117] "RemoveContainer" containerID="faa8f29715aee87d768668d7a8b45badc0d9d0b5b2bf527a7d66e539bd3a1912" Mar 11 10:26:13 crc kubenswrapper[4830]: I0311 10:26:13.708427 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:26:13 crc kubenswrapper[4830]: E0311 10:26:13.708713 4830 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:26:15 crc kubenswrapper[4830]: I0311 10:26:15.187013 4830 scope.go:117] "RemoveContainer" containerID="ce4edcc2b2ad947a2e21dc3e1aa6d69c4a16c087b530169b3716edffd9af53c0" Mar 11 10:26:27 crc kubenswrapper[4830]: I0311 10:26:27.933521 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:26:27 crc kubenswrapper[4830]: E0311 10:26:27.934643 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:26:39 crc kubenswrapper[4830]: I0311 10:26:39.932291 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:26:39 crc kubenswrapper[4830]: E0311 10:26:39.933057 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:26:51 crc kubenswrapper[4830]: I0311 
10:26:51.933545 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:26:51 crc kubenswrapper[4830]: E0311 10:26:51.934353 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:27:06 crc kubenswrapper[4830]: I0311 10:27:06.932738 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:27:06 crc kubenswrapper[4830]: E0311 10:27:06.933559 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:27:10 crc kubenswrapper[4830]: I0311 10:27:10.876077 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8xkhm"] Mar 11 10:27:10 crc kubenswrapper[4830]: E0311 10:27:10.876791 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86da2375-6a6d-4d75-9848-bf16a0287553" containerName="oc" Mar 11 10:27:10 crc kubenswrapper[4830]: I0311 10:27:10.876805 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="86da2375-6a6d-4d75-9848-bf16a0287553" containerName="oc" Mar 11 10:27:10 crc kubenswrapper[4830]: I0311 10:27:10.876999 4830 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="86da2375-6a6d-4d75-9848-bf16a0287553" containerName="oc" Mar 11 10:27:10 crc kubenswrapper[4830]: I0311 10:27:10.878463 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:10 crc kubenswrapper[4830]: I0311 10:27:10.915441 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xkhm"] Mar 11 10:27:10 crc kubenswrapper[4830]: I0311 10:27:10.953693 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-catalog-content\") pod \"community-operators-8xkhm\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:10 crc kubenswrapper[4830]: I0311 10:27:10.953751 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj76b\" (UniqueName: \"kubernetes.io/projected/c0e230e8-434f-4e8d-be61-1d4a0baadff4-kube-api-access-kj76b\") pod \"community-operators-8xkhm\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:10 crc kubenswrapper[4830]: I0311 10:27:10.953932 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-utilities\") pod \"community-operators-8xkhm\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:11 crc kubenswrapper[4830]: I0311 10:27:11.055441 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-catalog-content\") pod \"community-operators-8xkhm\" (UID: 
\"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:11 crc kubenswrapper[4830]: I0311 10:27:11.055492 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj76b\" (UniqueName: \"kubernetes.io/projected/c0e230e8-434f-4e8d-be61-1d4a0baadff4-kube-api-access-kj76b\") pod \"community-operators-8xkhm\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:11 crc kubenswrapper[4830]: I0311 10:27:11.055527 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-utilities\") pod \"community-operators-8xkhm\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:11 crc kubenswrapper[4830]: I0311 10:27:11.055962 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-utilities\") pod \"community-operators-8xkhm\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:11 crc kubenswrapper[4830]: I0311 10:27:11.056492 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-catalog-content\") pod \"community-operators-8xkhm\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:11 crc kubenswrapper[4830]: I0311 10:27:11.089329 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj76b\" (UniqueName: \"kubernetes.io/projected/c0e230e8-434f-4e8d-be61-1d4a0baadff4-kube-api-access-kj76b\") pod \"community-operators-8xkhm\" (UID: 
\"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:11 crc kubenswrapper[4830]: I0311 10:27:11.212567 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:11 crc kubenswrapper[4830]: I0311 10:27:11.724958 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xkhm"] Mar 11 10:27:12 crc kubenswrapper[4830]: I0311 10:27:12.235542 4830 generic.go:334] "Generic (PLEG): container finished" podID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerID="3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558" exitCode=0 Mar 11 10:27:12 crc kubenswrapper[4830]: I0311 10:27:12.235652 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xkhm" event={"ID":"c0e230e8-434f-4e8d-be61-1d4a0baadff4","Type":"ContainerDied","Data":"3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558"} Mar 11 10:27:12 crc kubenswrapper[4830]: I0311 10:27:12.235828 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xkhm" event={"ID":"c0e230e8-434f-4e8d-be61-1d4a0baadff4","Type":"ContainerStarted","Data":"c46c1693deeed06ead1035652fe6157d4abb3aaf6065b83fdd6ded076417d842"} Mar 11 10:27:13 crc kubenswrapper[4830]: I0311 10:27:13.246690 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xkhm" event={"ID":"c0e230e8-434f-4e8d-be61-1d4a0baadff4","Type":"ContainerStarted","Data":"a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f"} Mar 11 10:27:14 crc kubenswrapper[4830]: I0311 10:27:14.258574 4830 generic.go:334] "Generic (PLEG): container finished" podID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerID="a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f" exitCode=0 Mar 11 10:27:14 crc kubenswrapper[4830]: I0311 
10:27:14.258626 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xkhm" event={"ID":"c0e230e8-434f-4e8d-be61-1d4a0baadff4","Type":"ContainerDied","Data":"a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f"} Mar 11 10:27:15 crc kubenswrapper[4830]: I0311 10:27:15.271916 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xkhm" event={"ID":"c0e230e8-434f-4e8d-be61-1d4a0baadff4","Type":"ContainerStarted","Data":"154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67"} Mar 11 10:27:15 crc kubenswrapper[4830]: I0311 10:27:15.301278 4830 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8xkhm" podStartSLOduration=2.782719233 podStartE2EDuration="5.301246989s" podCreationTimestamp="2026-03-11 10:27:10 +0000 UTC" firstStartedPulling="2026-03-11 10:27:12.237743897 +0000 UTC m=+4400.018894586" lastFinishedPulling="2026-03-11 10:27:14.756271653 +0000 UTC m=+4402.537422342" observedRunningTime="2026-03-11 10:27:15.291939075 +0000 UTC m=+4403.073089784" watchObservedRunningTime="2026-03-11 10:27:15.301246989 +0000 UTC m=+4403.082397678" Mar 11 10:27:19 crc kubenswrapper[4830]: I0311 10:27:19.932421 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:27:19 crc kubenswrapper[4830]: E0311 10:27:19.933240 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:27:21 crc kubenswrapper[4830]: I0311 10:27:21.213338 4830 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:21 crc kubenswrapper[4830]: I0311 10:27:21.213426 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:21 crc kubenswrapper[4830]: I0311 10:27:21.734594 4830 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:21 crc kubenswrapper[4830]: I0311 10:27:21.785574 4830 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:21 crc kubenswrapper[4830]: I0311 10:27:21.971129 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xkhm"] Mar 11 10:27:23 crc kubenswrapper[4830]: I0311 10:27:23.345473 4830 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8xkhm" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerName="registry-server" containerID="cri-o://154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67" gracePeriod=2 Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.290916 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.355401 4830 generic.go:334] "Generic (PLEG): container finished" podID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerID="154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67" exitCode=0 Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.355440 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xkhm" event={"ID":"c0e230e8-434f-4e8d-be61-1d4a0baadff4","Type":"ContainerDied","Data":"154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67"} Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.355498 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xkhm" event={"ID":"c0e230e8-434f-4e8d-be61-1d4a0baadff4","Type":"ContainerDied","Data":"c46c1693deeed06ead1035652fe6157d4abb3aaf6065b83fdd6ded076417d842"} Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.355495 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8xkhm" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.355526 4830 scope.go:117] "RemoveContainer" containerID="154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.375350 4830 scope.go:117] "RemoveContainer" containerID="a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.395129 4830 scope.go:117] "RemoveContainer" containerID="3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.438984 4830 scope.go:117] "RemoveContainer" containerID="154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67" Mar 11 10:27:24 crc kubenswrapper[4830]: E0311 10:27:24.439487 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67\": container with ID starting with 154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67 not found: ID does not exist" containerID="154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.439544 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67"} err="failed to get container status \"154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67\": rpc error: code = NotFound desc = could not find container \"154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67\": container with ID starting with 154c761515a9efd5be1dde83a22ca9b032bd2775339db47deebdce09abb34a67 not found: ID does not exist" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.439579 4830 scope.go:117] "RemoveContainer" 
containerID="a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f" Mar 11 10:27:24 crc kubenswrapper[4830]: E0311 10:27:24.439992 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f\": container with ID starting with a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f not found: ID does not exist" containerID="a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.440077 4830 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f"} err="failed to get container status \"a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f\": rpc error: code = NotFound desc = could not find container \"a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f\": container with ID starting with a206bf7606c1db3a795d58d0a2062b6e68f1dd6a759a83cb40753cf53dbe822f not found: ID does not exist" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.440110 4830 scope.go:117] "RemoveContainer" containerID="3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558" Mar 11 10:27:24 crc kubenswrapper[4830]: E0311 10:27:24.440410 4830 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558\": container with ID starting with 3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558 not found: ID does not exist" containerID="3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.440445 4830 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558"} err="failed to get container status \"3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558\": rpc error: code = NotFound desc = could not find container \"3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558\": container with ID starting with 3c34ddf90d85b400b0131b40e5c7b4278a718a7f5b3fa16f307d71fdd4c14558 not found: ID does not exist" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.465401 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-catalog-content\") pod \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.465682 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj76b\" (UniqueName: \"kubernetes.io/projected/c0e230e8-434f-4e8d-be61-1d4a0baadff4-kube-api-access-kj76b\") pod \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.465717 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-utilities\") pod \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\" (UID: \"c0e230e8-434f-4e8d-be61-1d4a0baadff4\") " Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.466761 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-utilities" (OuterVolumeSpecName: "utilities") pod "c0e230e8-434f-4e8d-be61-1d4a0baadff4" (UID: "c0e230e8-434f-4e8d-be61-1d4a0baadff4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.470932 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e230e8-434f-4e8d-be61-1d4a0baadff4-kube-api-access-kj76b" (OuterVolumeSpecName: "kube-api-access-kj76b") pod "c0e230e8-434f-4e8d-be61-1d4a0baadff4" (UID: "c0e230e8-434f-4e8d-be61-1d4a0baadff4"). InnerVolumeSpecName "kube-api-access-kj76b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.524801 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0e230e8-434f-4e8d-be61-1d4a0baadff4" (UID: "c0e230e8-434f-4e8d-be61-1d4a0baadff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.568855 4830 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.568914 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj76b\" (UniqueName: \"kubernetes.io/projected/c0e230e8-434f-4e8d-be61-1d4a0baadff4-kube-api-access-kj76b\") on node \"crc\" DevicePath \"\"" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.568929 4830 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0e230e8-434f-4e8d-be61-1d4a0baadff4-utilities\") on node \"crc\" DevicePath \"\"" Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.702778 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xkhm"] Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 
10:27:24.711696 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8xkhm"] Mar 11 10:27:24 crc kubenswrapper[4830]: I0311 10:27:24.945968 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" path="/var/lib/kubelet/pods/c0e230e8-434f-4e8d-be61-1d4a0baadff4/volumes" Mar 11 10:27:33 crc kubenswrapper[4830]: I0311 10:27:33.932233 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:27:33 crc kubenswrapper[4830]: E0311 10:27:33.933010 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:27:44 crc kubenswrapper[4830]: I0311 10:27:44.933624 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:27:44 crc kubenswrapper[4830]: E0311 10:27:44.934396 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:27:58 crc kubenswrapper[4830]: I0311 10:27:58.933061 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:27:58 crc kubenswrapper[4830]: E0311 10:27:58.933944 4830 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.146799 4830 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29553748-hcvml"] Mar 11 10:28:00 crc kubenswrapper[4830]: E0311 10:28:00.147674 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerName="registry-server" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.147692 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerName="registry-server" Mar 11 10:28:00 crc kubenswrapper[4830]: E0311 10:28:00.147732 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerName="extract-utilities" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.147742 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerName="extract-utilities" Mar 11 10:28:00 crc kubenswrapper[4830]: E0311 10:28:00.147768 4830 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerName="extract-content" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.147777 4830 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerName="extract-content" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.148044 4830 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0e230e8-434f-4e8d-be61-1d4a0baadff4" containerName="registry-server" Mar 11 10:28:00 crc 
kubenswrapper[4830]: I0311 10:28:00.148911 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553748-hcvml" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.150990 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.151585 4830 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qlw4q" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.152348 4830 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.157957 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553748-hcvml"] Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.287691 4830 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltp8p\" (UniqueName: \"kubernetes.io/projected/ff954cdc-677b-40ac-a9a1-fd7b283dc2a3-kube-api-access-ltp8p\") pod \"auto-csr-approver-29553748-hcvml\" (UID: \"ff954cdc-677b-40ac-a9a1-fd7b283dc2a3\") " pod="openshift-infra/auto-csr-approver-29553748-hcvml" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.390634 4830 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltp8p\" (UniqueName: \"kubernetes.io/projected/ff954cdc-677b-40ac-a9a1-fd7b283dc2a3-kube-api-access-ltp8p\") pod \"auto-csr-approver-29553748-hcvml\" (UID: \"ff954cdc-677b-40ac-a9a1-fd7b283dc2a3\") " pod="openshift-infra/auto-csr-approver-29553748-hcvml" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.409504 4830 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltp8p\" (UniqueName: \"kubernetes.io/projected/ff954cdc-677b-40ac-a9a1-fd7b283dc2a3-kube-api-access-ltp8p\") pod 
\"auto-csr-approver-29553748-hcvml\" (UID: \"ff954cdc-677b-40ac-a9a1-fd7b283dc2a3\") " pod="openshift-infra/auto-csr-approver-29553748-hcvml" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.476951 4830 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29553748-hcvml" Mar 11 10:28:00 crc kubenswrapper[4830]: I0311 10:28:00.956108 4830 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29553748-hcvml"] Mar 11 10:28:01 crc kubenswrapper[4830]: I0311 10:28:01.679291 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553748-hcvml" event={"ID":"ff954cdc-677b-40ac-a9a1-fd7b283dc2a3","Type":"ContainerStarted","Data":"01db5e901a5c89586b962e7872b2b0cd480e05cdf31069c5496d4a51d6bdcded"} Mar 11 10:28:03 crc kubenswrapper[4830]: I0311 10:28:03.699335 4830 generic.go:334] "Generic (PLEG): container finished" podID="ff954cdc-677b-40ac-a9a1-fd7b283dc2a3" containerID="8a95d3f0bc0dd038c1ddd9e1ccf8769b73276f1a58e1aa776de4e1879b62ccf4" exitCode=0 Mar 11 10:28:03 crc kubenswrapper[4830]: I0311 10:28:03.699446 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553748-hcvml" event={"ID":"ff954cdc-677b-40ac-a9a1-fd7b283dc2a3","Type":"ContainerDied","Data":"8a95d3f0bc0dd038c1ddd9e1ccf8769b73276f1a58e1aa776de4e1879b62ccf4"} Mar 11 10:28:05 crc kubenswrapper[4830]: I0311 10:28:05.004619 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553748-hcvml" Mar 11 10:28:05 crc kubenswrapper[4830]: I0311 10:28:05.079808 4830 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltp8p\" (UniqueName: \"kubernetes.io/projected/ff954cdc-677b-40ac-a9a1-fd7b283dc2a3-kube-api-access-ltp8p\") pod \"ff954cdc-677b-40ac-a9a1-fd7b283dc2a3\" (UID: \"ff954cdc-677b-40ac-a9a1-fd7b283dc2a3\") " Mar 11 10:28:05 crc kubenswrapper[4830]: I0311 10:28:05.086053 4830 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff954cdc-677b-40ac-a9a1-fd7b283dc2a3-kube-api-access-ltp8p" (OuterVolumeSpecName: "kube-api-access-ltp8p") pod "ff954cdc-677b-40ac-a9a1-fd7b283dc2a3" (UID: "ff954cdc-677b-40ac-a9a1-fd7b283dc2a3"). InnerVolumeSpecName "kube-api-access-ltp8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 11 10:28:05 crc kubenswrapper[4830]: I0311 10:28:05.183852 4830 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltp8p\" (UniqueName: \"kubernetes.io/projected/ff954cdc-677b-40ac-a9a1-fd7b283dc2a3-kube-api-access-ltp8p\") on node \"crc\" DevicePath \"\"" Mar 11 10:28:05 crc kubenswrapper[4830]: I0311 10:28:05.716975 4830 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29553748-hcvml" event={"ID":"ff954cdc-677b-40ac-a9a1-fd7b283dc2a3","Type":"ContainerDied","Data":"01db5e901a5c89586b962e7872b2b0cd480e05cdf31069c5496d4a51d6bdcded"} Mar 11 10:28:05 crc kubenswrapper[4830]: I0311 10:28:05.717324 4830 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01db5e901a5c89586b962e7872b2b0cd480e05cdf31069c5496d4a51d6bdcded" Mar 11 10:28:05 crc kubenswrapper[4830]: I0311 10:28:05.717043 4830 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29553748-hcvml" Mar 11 10:28:06 crc kubenswrapper[4830]: I0311 10:28:06.073564 4830 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29553742-m57cf"] Mar 11 10:28:06 crc kubenswrapper[4830]: I0311 10:28:06.082677 4830 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29553742-m57cf"] Mar 11 10:28:06 crc kubenswrapper[4830]: I0311 10:28:06.945774 4830 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4886d88-dbe8-48c2-a70d-dfeb1027535a" path="/var/lib/kubelet/pods/a4886d88-dbe8-48c2-a70d-dfeb1027535a/volumes" Mar 11 10:28:11 crc kubenswrapper[4830]: I0311 10:28:11.932782 4830 scope.go:117] "RemoveContainer" containerID="2394cf02223d82ce47a7739bc50289816c5f9b762b4e1497131c114be7d2ef1a" Mar 11 10:28:11 crc kubenswrapper[4830]: E0311 10:28:11.933464 4830 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p7jq8_openshift-machine-config-operator(2bdde2fd-3db4-4b41-9287-58960dcab5d9)\"" pod="openshift-machine-config-operator/machine-config-daemon-p7jq8" podUID="2bdde2fd-3db4-4b41-9287-58960dcab5d9" Mar 11 10:28:15 crc kubenswrapper[4830]: I0311 10:28:15.299797 4830 scope.go:117] "RemoveContainer" containerID="8b59d263de4e240e4a3f54675b2c8f260a88ea7fe9408abdaa50664e8bf0b00e"